Dec 09 11:31:52 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 09 11:31:52 crc restorecon[4730]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 11:31:52 crc restorecon[4730]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 
11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 
crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 
crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:52 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc 
restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 11:31:53 crc restorecon[4730]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 09 11:31:53 crc kubenswrapper[4745]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.378716 4745 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.386783 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.386954 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387036 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387101 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387167 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387229 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387289 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387357 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387428 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387489 4745 
feature_gate.go:330] unrecognized feature gate: Example Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387599 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387678 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387740 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387811 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387877 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.387947 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388007 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388066 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388124 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388196 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388264 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388326 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388391 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388458 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388552 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388620 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388694 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388760 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388831 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388892 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.388951 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389026 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389090 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389159 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 11:31:53 
crc kubenswrapper[4745]: W1209 11:31:53.389230 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389293 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389358 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389420 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389479 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389572 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389640 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389707 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389769 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389843 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389908 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.389969 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390040 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
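The entries above are dominated by repeated `feature_gate.go:330] unrecognized feature gate: <Name>` warnings. These are not part of the captured log; as a minimal analysis sketch (assuming only the log format shown above), the unique gate names can be collected like this:

```python
import re

# Each kubelet warning has the shape:
#   W1209 11:31:53.388264 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
# Collect the distinct gate names so the many repeats collapse to one entry each.
GATE_RE = re.compile(r"feature_gate\.go:330\] unrecognized feature gate: (\S+)")

def unrecognized_gates(log_text: str) -> set[str]:
    """Return the set of gate names the kubelet did not recognize."""
    return set(GATE_RE.findall(log_text))

sample = (
    "W1209 11:31:53.388264 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed\n"
    "W1209 11:31:53.388326 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall\n"
    "W1209 11:31:53.406227 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed\n"
)
print(sorted(unrecognized_gates(sample)))  # ['ClusterAPIInstall', 'HardwareSpeed']
```

Feeding the full journal excerpt through `unrecognized_gates` makes it easy to see that the same gates (OpenShift-specific names the upstream kubelet does not know) recur across the repeated warning blocks.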
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390111 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390176 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390245 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390307 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390400 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390467 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390561 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390639 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390710 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390791 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390863 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390925 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.390985 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391052 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 
11:31:53.391119 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391182 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391242 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391302 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391361 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391427 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391490 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391600 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391673 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.391743 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392125 4745 flags.go:64] FLAG: --address="0.0.0.0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392223 4745 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392311 4745 flags.go:64] FLAG: --anonymous-auth="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392379 4745 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392452 4745 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 09 11:31:53 crc 
kubenswrapper[4745]: I1209 11:31:53.392560 4745 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392652 4745 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392757 4745 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392839 4745 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392907 4745 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.392972 4745 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393058 4745 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393126 4745 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393190 4745 flags.go:64] FLAG: --cgroup-root="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393253 4745 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393315 4745 flags.go:64] FLAG: --client-ca-file="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393385 4745 flags.go:64] FLAG: --cloud-config="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393449 4745 flags.go:64] FLAG: --cloud-provider="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393557 4745 flags.go:64] FLAG: --cluster-dns="[]" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393666 4745 flags.go:64] FLAG: --cluster-domain="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393746 4745 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393821 4745 flags.go:64] FLAG: 
--config-dir="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393889 4745 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.393953 4745 flags.go:64] FLAG: --container-log-max-files="5" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394017 4745 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394077 4745 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394145 4745 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394219 4745 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394283 4745 flags.go:64] FLAG: --contention-profiling="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394388 4745 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394459 4745 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394549 4745 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394618 4745 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394700 4745 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394765 4745 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394845 4745 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394930 4745 flags.go:64] FLAG: --enable-load-reader="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.394998 4745 flags.go:64] FLAG: --enable-server="true" Dec 09 11:31:53 crc 
kubenswrapper[4745]: I1209 11:31:53.395061 4745 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395175 4745 flags.go:64] FLAG: --event-burst="100" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395265 4745 flags.go:64] FLAG: --event-qps="50" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395339 4745 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395412 4745 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395482 4745 flags.go:64] FLAG: --eviction-hard="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395584 4745 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395661 4745 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395747 4745 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395815 4745 flags.go:64] FLAG: --eviction-soft="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395881 4745 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.395953 4745 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396018 4745 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396087 4745 flags.go:64] FLAG: --experimental-mounter-path="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396153 4745 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396222 4745 flags.go:64] FLAG: --fail-swap-on="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396286 4745 flags.go:64] FLAG: --feature-gates="" Dec 09 11:31:53 crc 
kubenswrapper[4745]: I1209 11:31:53.396359 4745 flags.go:64] FLAG: --file-check-frequency="20s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396429 4745 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396500 4745 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396604 4745 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396678 4745 flags.go:64] FLAG: --healthz-port="10248" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396742 4745 flags.go:64] FLAG: --help="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396809 4745 flags.go:64] FLAG: --hostname-override="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396877 4745 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.396949 4745 flags.go:64] FLAG: --http-check-frequency="20s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397014 4745 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397077 4745 flags.go:64] FLAG: --image-credential-provider-config="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397140 4745 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397203 4745 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397265 4745 flags.go:64] FLAG: --image-service-endpoint="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397338 4745 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397429 4745 flags.go:64] FLAG: --kube-api-burst="100" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397556 4745 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397648 4745 flags.go:64] FLAG: --kube-api-qps="50" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397714 4745 flags.go:64] FLAG: --kube-reserved="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397778 4745 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397846 4745 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397911 4745 flags.go:64] FLAG: --kubelet-cgroups="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.397975 4745 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398046 4745 flags.go:64] FLAG: --lock-file="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398110 4745 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398180 4745 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398245 4745 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398397 4745 flags.go:64] FLAG: --log-json-split-stream="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398471 4745 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398596 4745 flags.go:64] FLAG: --log-text-split-stream="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398676 4745 flags.go:64] FLAG: --logging-format="text" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398742 4745 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398806 4745 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.398877 4745 flags.go:64] FLAG: --manifest-url="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.398941 4745 flags.go:64] FLAG: --manifest-url-header="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399014 4745 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399086 4745 flags.go:64] FLAG: --max-open-files="1000000" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399243 4745 flags.go:64] FLAG: --max-pods="110" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399311 4745 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399382 4745 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399447 4745 flags.go:64] FLAG: --memory-manager-policy="None" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399547 4745 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399619 4745 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399683 4745 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399762 4745 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399846 4745 flags.go:64] FLAG: --node-status-max-images="50" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399911 4745 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.399974 4745 flags.go:64] FLAG: --oom-score-adj="-999" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400044 4745 flags.go:64] FLAG: --pod-cidr="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400107 4745 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400184 4745 flags.go:64] FLAG: --pod-manifest-path="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400253 4745 flags.go:64] FLAG: --pod-max-pids="-1" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400318 4745 flags.go:64] FLAG: --pods-per-core="0" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400381 4745 flags.go:64] FLAG: --port="10250" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400443 4745 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400541 4745 flags.go:64] FLAG: --provider-id="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400610 4745 flags.go:64] FLAG: --qos-reserved="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400673 4745 flags.go:64] FLAG: --read-only-port="10255" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400754 4745 flags.go:64] FLAG: --register-node="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400818 4745 flags.go:64] FLAG: --register-schedulable="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400881 4745 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.400950 4745 flags.go:64] FLAG: --registry-burst="10" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401013 4745 flags.go:64] FLAG: --registry-qps="5" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401089 4745 flags.go:64] FLAG: --reserved-cpus="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401156 4745 flags.go:64] FLAG: --reserved-memory="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401229 4745 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.401294 4745 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401357 4745 flags.go:64] FLAG: --rotate-certificates="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401434 4745 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401542 4745 flags.go:64] FLAG: --runonce="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401617 4745 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401681 4745 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401756 4745 flags.go:64] FLAG: --seccomp-default="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401828 4745 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401891 4745 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.401955 4745 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402023 4745 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402095 4745 flags.go:64] FLAG: --storage-driver-password="root" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402159 4745 flags.go:64] FLAG: --storage-driver-secure="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402238 4745 flags.go:64] FLAG: --storage-driver-table="stats" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402338 4745 flags.go:64] FLAG: --storage-driver-user="root" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402414 4745 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402479 4745 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 09 
11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402616 4745 flags.go:64] FLAG: --system-cgroups="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402683 4745 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402752 4745 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402816 4745 flags.go:64] FLAG: --tls-cert-file="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402898 4745 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.402970 4745 flags.go:64] FLAG: --tls-min-version="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403033 4745 flags.go:64] FLAG: --tls-private-key-file="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403097 4745 flags.go:64] FLAG: --topology-manager-policy="none" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403161 4745 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403226 4745 flags.go:64] FLAG: --topology-manager-scope="container" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403291 4745 flags.go:64] FLAG: --v="2" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403370 4745 flags.go:64] FLAG: --version="false" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403438 4745 flags.go:64] FLAG: --vmodule="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403504 4745 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.403594 4745 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.403825 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.403897 4745 feature_gate.go:330] unrecognized feature 
gate: CSIDriverSharedResource Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.403975 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404051 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404116 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404179 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404246 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404315 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404378 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404444 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
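The long run of `flags.go:64] FLAG: --name="value"` entries earlier in this startup sequence is the kubelet echoing its effective command-line configuration. As a sketch (not part of the captured log, and assuming only the `FLAG: --name="value"` shape shown), that dump can be turned into a mapping for diffing kubelet configuration between nodes or boots:

```python
import re

# kubelet logs every flag at startup as, e.g.:
#   I1209 11:31:53.392125 4745 flags.go:64] FLAG: --address="0.0.0.0"
# Extract name/value pairs into a dict keyed by flag name.
FLAG_RE = re.compile(r'flags\.go:64\] FLAG: --([\w-]+)="([^"]*)"')

def parse_flags(log_text: str) -> dict[str, str]:
    """Map each logged kubelet flag name to its quoted value."""
    return dict(FLAG_RE.findall(log_text))

sample = (
    'I1209 11:31:53.392125 4745 flags.go:64] FLAG: --address="0.0.0.0"\n'
    'I1209 11:31:53.393126 4745 flags.go:64] FLAG: --cgroup-driver="cgroupfs"\n'
    'I1209 11:31:53.399683 4745 flags.go:64] FLAG: --node-ip="192.168.126.11"\n'
)
flags = parse_flags(sample)
print(flags["node-ip"])  # 192.168.126.11
```

Two dumps parsed this way can be compared with a plain dict difference to spot flags that changed between restarts.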
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404536 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404620 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404690 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404754 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404822 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404893 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.404958 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405030 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405094 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405164 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405228 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405296 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405359 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405439 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 11:31:53 crc 
kubenswrapper[4745]: W1209 11:31:53.405546 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405613 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405686 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405755 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405823 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405884 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.405945 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406012 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406075 4745 feature_gate.go:330] unrecognized feature gate: Example Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406154 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406227 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406290 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406373 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406444 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406598 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 
11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406667 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406742 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406810 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406884 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.406947 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407008 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407075 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407137 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407198 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407258 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407324 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407386 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407460 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407555 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407623 4745 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407684 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407784 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407853 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.407925 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408007 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408077 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408145 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408207 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408267 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408327 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408397 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408467 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408567 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408635 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408697 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408758 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.408830 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.408902 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.415987 4745 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.416038 4745 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416126 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416136 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416143 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416149 4745 feature_gate.go:330] 
unrecognized feature gate: SignatureStores Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416154 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416160 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416166 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416171 4745 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416177 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416182 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416187 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416193 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416198 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416204 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416209 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416216 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416227 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416233 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416240 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416247 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416253 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416259 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416265 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416270 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416276 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416281 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416286 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416292 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416297 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416302 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: 
W1209 11:31:53.416307 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416312 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416317 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416323 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416329 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416335 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416340 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416346 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416351 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416356 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416363 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416370 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416376 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416383 4745 feature_gate.go:330] unrecognized feature gate: Example Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416392 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416399 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416405 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416410 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416415 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416420 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416425 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416432 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416439 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416445 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416450 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416456 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416461 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416466 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416471 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416476 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416482 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416487 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416492 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416497 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416502 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416526 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 
11:31:53.416532 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416537 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416542 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416547 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416553 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.416563 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416716 4745 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416727 4745 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416733 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416740 4745 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416747 4745 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416755 4745 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416761 4745 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416767 4745 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416773 4745 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416780 4745 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416788 4745 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416794 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416800 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416806 4745 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416811 4745 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416817 4745 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416822 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416827 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416832 4745 feature_gate.go:330] unrecognized feature gate: Example Dec 09 11:31:53 crc kubenswrapper[4745]: 
W1209 11:31:53.416837 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416842 4745 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416848 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416853 4745 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416858 4745 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416864 4745 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416870 4745 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416877 4745 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416883 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416888 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416893 4745 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416898 4745 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416904 4745 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416910 4745 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416917 4745 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416923 4745 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416928 4745 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416933 4745 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416939 4745 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416944 4745 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416949 4745 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416954 4745 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416959 4745 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416964 4745 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416969 4745 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416975 4745 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416980 4745 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416985 4745 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416991 4745 feature_gate.go:330] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.416997 4745 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417002 4745 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417007 4745 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417012 4745 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417018 4745 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417023 4745 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417028 4745 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417033 4745 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417039 4745 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417044 4745 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417050 4745 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417055 4745 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417060 4745 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417065 4745 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 11:31:53 
crc kubenswrapper[4745]: W1209 11:31:53.417071 4745 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417076 4745 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417081 4745 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417086 4745 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417091 4745 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417096 4745 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417102 4745 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417107 4745 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.417113 4745 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.417123 4745 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.417323 4745 server.go:940] "Client rotation is on, will bootstrap in background" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.420342 4745 bootstrap.go:85] "Current 
kubeconfig file contents are still valid, no bootstrap necessary" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.420433 4745 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.421080 4745 server.go:997] "Starting client certificate rotation" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.421099 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.421280 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-21 03:50:50.011733601 +0000 UTC Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.421443 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.429536 4745 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.431281 4745 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.432074 4745 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.439672 4745 log.go:25] "Validated CRI v1 runtime API" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.466144 4745 log.go:25] "Validated CRI v1 image API" 
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.468717 4745 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.471316 4745 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-09-11-27-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.471452 4745 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.490331 4745 manager.go:217] Machine: {Timestamp:2025-12-09 11:31:53.488861352 +0000 UTC m=+0.314062916 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:96c45dda-f1d0-4fb2-b98c-edc8f5390e21 BootID:9cf46efc-ff1e-4018-a4c0-ef197e8adebf Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 
DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9a:ee:02 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9a:ee:02 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:43:80:db Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:31:ff:51 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7f:9d:55 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:dd:10:08 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e4:23:2b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:cf:33:da:34:40 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:8c:85:93:d6:00 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] 
Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.490886 4745 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.491176 4745 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.491701 4745 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.492034 4745 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.492162 4745 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.492538 4745 topology_manager.go:138] "Creating topology manager with none policy"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.492639 4745 container_manager_linux.go:303] "Creating device plugin manager"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.492910 4745 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.493016 4745 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.493445 4745 state_mem.go:36] "Initialized new in-memory state store"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.493654 4745 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.494581 4745 kubelet.go:418] "Attempting to sync node with API server"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.494706 4745 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.494834 4745 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.494956 4745 kubelet.go:324] "Adding apiserver pod source"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.495141 4745 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.497433 4745 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.497967 4745 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.499057 4745 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.499039 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.499319 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.499080 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.499367 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.499920 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500112 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500213 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500307 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500394 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500463 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500580 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500655 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500723 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500791 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500874 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.500955 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.501192 4745 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.501792 4745 server.go:1280] "Started kubelet"
Dec 09 11:31:53 crc systemd[1]: Started Kubernetes Kubelet.
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.502524 4745 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.504878 4745 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.505824 4745 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.506474 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.507257 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.45:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f88ba7d67a76b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 11:31:53.501763435 +0000 UTC m=+0.326964969,LastTimestamp:2025-12-09 11:31:53.501763435 +0000 UTC m=+0.326964969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.508787 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.508858 4745 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.510198 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:12:06.211557435 +0000 UTC
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.510299 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 100h40m12.701265966s for next certificate rotation
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.510637 4745 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.510693 4745 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.510651 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.512676 4745 server.go:460] "Adding debug handlers to kubelet server"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.513263 4745 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.513352 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="200ms"
Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.513247 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.513408 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.513570 4745 factory.go:55] Registering systemd factory
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.513585 4745 factory.go:221] Registration of the systemd container factory successfully
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.515229 4745 factory.go:153] Registering CRI-O factory
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.515257 4745 factory.go:221] Registration of the crio container factory successfully
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.515328 4745 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.515370 4745 factory.go:103] Registering Raw factory
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.515388 4745 manager.go:1196] Started watching for new ooms in manager
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.516041 4745 manager.go:319] Starting recovery of all containers
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524649 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524698 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524716 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524729 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524741 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524754 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524802 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524815 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524829 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524842 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524854 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524866 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524878 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524892 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524903 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524916 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524929 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524943 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524955 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524966 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524976 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.524989 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525001 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525012 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525023 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525035 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525051 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525064 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525076 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525089 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525101 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525113 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525147 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525159 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525171 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525183 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525194 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525208 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525221 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525234 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525247 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525258 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525270 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525281 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525293 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525305 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525317 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525330 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525344 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525356 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525367 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525380 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525397 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525410 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525424 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525436 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525449 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525461 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525474 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525486 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525497 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525525 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525538 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525551 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525562 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525576 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525588 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525599 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525609 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525621 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525632 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525644 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525655 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525666 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525676 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525688 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525699 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525711 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525723 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525734 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525747 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525761 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3"
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525776 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525788 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525799 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525816 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525827 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525841 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525853 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525864 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525875 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525887 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525898 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525910 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525921 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525933 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525943 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525957 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.525968 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.526608 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" 
seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.528883 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.528911 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.528930 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.528954 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.528990 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529017 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.529050 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529087 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529110 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529137 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529162 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529185 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529214 4745 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529237 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529253 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529269 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529291 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529310 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529325 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.529985 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530006 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530019 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530038 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530053 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530067 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530085 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530098 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530113 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530184 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530199 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530219 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530233 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530252 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530267 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530283 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.530300 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.534747 4745 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535162 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535184 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535196 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535208 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535219 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535230 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535242 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535252 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535264 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535275 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535285 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535296 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" 
seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535308 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535318 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535328 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535338 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535348 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535358 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.535369 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535392 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535404 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535415 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535424 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535435 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535445 4745 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535463 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535474 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535484 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535494 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535504 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535542 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535554 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535564 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535575 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535585 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535595 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535606 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535615 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535626 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535635 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535645 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535664 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535673 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535683 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535692 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535702 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535712 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535726 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535747 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535772 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535788 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535800 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535814 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535827 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535847 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535861 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535874 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535887 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535899 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535911 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535922 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" 
seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535934 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535947 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535959 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535982 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.535998 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536011 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.536023 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536035 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536046 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536057 4745 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536068 4745 reconstruct.go:97] "Volume reconstruction finished" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.536076 4745 reconciler.go:26] "Reconciler: start to sync state" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.538046 4745 manager.go:324] Recovery completed Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.550983 4745 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.552805 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.553546 4745 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.553598 4745 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.553624 4745 kubelet.go:2335] "Starting kubelet main sync loop" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.553670 4745 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 09 11:31:53 crc kubenswrapper[4745]: W1209 11:31:53.554629 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.554700 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.555029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.555067 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.555082 4745 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.556599 4745 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.556618 4745 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.556634 4745 state_mem.go:36] "Initialized new in-memory state store" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.611374 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.630672 4745 policy_none.go:49] "None policy: Start" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.632537 4745 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.632562 4745 state_mem.go:35] "Initializing new in-memory state store" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.654307 4745 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.679345 4745 manager.go:334] "Starting Device Plugin manager" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.679408 4745 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.679420 4745 server.go:79] "Starting device plugin registration server" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.679810 4745 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.679825 4745 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.680094 4745 plugin_watcher.go:51] "Plugin Watcher 
Start" path="/var/lib/kubelet/plugins_registry" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.680162 4745 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.680176 4745 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.686837 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.714352 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="400ms" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.780759 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.781815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.781886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.781907 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.781949 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.782550 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.854831 
4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.854936 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856337 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856588 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.856637 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857309 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857521 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857605 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857751 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.857780 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858878 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858952 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.858975 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859730 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859952 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859978 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.859990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860088 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860177 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860206 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860831 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.860988 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861022 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.861820 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938864 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938903 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938925 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938939 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938967 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938979 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.938992 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939006 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939019 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 
11:31:53.939031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939058 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939071 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.939086 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:53 crc kubenswrapper[4745]: 
I1209 11:31:53.983595 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.984818 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.984844 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.984856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:53 crc kubenswrapper[4745]: I1209 11:31:53.984879 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:31:53 crc kubenswrapper[4745]: E1209 11:31:53.985284 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040090 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040154 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040195 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040267 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040296 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040293 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040322 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040341 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040351 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040307 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040308 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040380 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040414 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 
11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040425 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040481 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040429 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040593 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040624 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040644 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040674 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040693 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.040709 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041053 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041077 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041100 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041119 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.041122 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:54 crc 
kubenswrapper[4745]: E1209 11:31:54.115383 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="800ms" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.186325 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.205564 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.215938 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7ce0bdb544454be73db7c29b8d9553d031c36d38a0a4627883f26ff76157eba6 WatchSource:0}: Error finding container 7ce0bdb544454be73db7c29b8d9553d031c36d38a0a4627883f26ff76157eba6: Status 404 returned error can't find the container with id 7ce0bdb544454be73db7c29b8d9553d031c36d38a0a4627883f26ff76157eba6 Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.220658 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.221622 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-37234aff7dee9046c692ce76f5bd00fe651c640e69f534135f35fdcb37d46187 WatchSource:0}: Error finding container 37234aff7dee9046c692ce76f5bd00fe651c640e69f534135f35fdcb37d46187: Status 404 returned error can't find the container with id 37234aff7dee9046c692ce76f5bd00fe651c640e69f534135f35fdcb37d46187 Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.233580 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-070c03050d59f77cc23d127784cd20d99955ed19637b3af2bd7ee655d4b8ec2c WatchSource:0}: Error finding container 070c03050d59f77cc23d127784cd20d99955ed19637b3af2bd7ee655d4b8ec2c: Status 404 returned error can't find the container with id 070c03050d59f77cc23d127784cd20d99955ed19637b3af2bd7ee655d4b8ec2c Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.236978 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.242676 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.249607 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2276c7e07e9cc6e63729ea86f6ccd7092018a5400da162f503c76a26407d1132 WatchSource:0}: Error finding container 2276c7e07e9cc6e63729ea86f6ccd7092018a5400da162f503c76a26407d1132: Status 404 returned error can't find the container with id 2276c7e07e9cc6e63729ea86f6ccd7092018a5400da162f503c76a26407d1132 Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.262255 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1155885e999b80adfa0738b4d3394153596de38a705bfb945c2c054d297d9ba6 WatchSource:0}: Error finding container 1155885e999b80adfa0738b4d3394153596de38a705bfb945c2c054d297d9ba6: Status 404 returned error can't find the container with id 1155885e999b80adfa0738b4d3394153596de38a705bfb945c2c054d297d9ba6 Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.331433 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:54 crc kubenswrapper[4745]: E1209 11:31:54.331541 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.385545 4745 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.387280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.387317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.387326 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.387346 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:31:54 crc kubenswrapper[4745]: E1209 11:31:54.387703 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.499011 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:54 crc kubenswrapper[4745]: E1209 11:31:54.499091 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.508483 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.45:6443: connect: connection refused Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.560803 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2276c7e07e9cc6e63729ea86f6ccd7092018a5400da162f503c76a26407d1132"} Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.561836 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"070c03050d59f77cc23d127784cd20d99955ed19637b3af2bd7ee655d4b8ec2c"} Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.566985 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"37234aff7dee9046c692ce76f5bd00fe651c640e69f534135f35fdcb37d46187"} Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.570347 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7ce0bdb544454be73db7c29b8d9553d031c36d38a0a4627883f26ff76157eba6"} Dec 09 11:31:54 crc kubenswrapper[4745]: I1209 11:31:54.570993 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1155885e999b80adfa0738b4d3394153596de38a705bfb945c2c054d297d9ba6"} Dec 09 11:31:54 crc kubenswrapper[4745]: W1209 11:31:54.711566 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:54 crc 
kubenswrapper[4745]: E1209 11:31:54.711687 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:54 crc kubenswrapper[4745]: E1209 11:31:54.916219 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="1.6s" Dec 09 11:31:55 crc kubenswrapper[4745]: W1209 11:31:55.063030 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:55 crc kubenswrapper[4745]: E1209 11:31:55.063091 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.188566 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.189947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.189986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 
11:31:55.189994 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.190018 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:31:55 crc kubenswrapper[4745]: E1209 11:31:55.190496 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.508588 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.575639 4745 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297" exitCode=0 Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.575725 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.575811 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577074 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577164 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577978 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.577998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.578012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.578054 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.579245 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.579303 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 
11:31:55.579365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.579582 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3" exitCode=0 Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.579633 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.579624 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.580147 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.580447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.580469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.580481 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.581547 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d" exitCode=0 Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.581582 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.581552 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: E1209 11:31:55.581635 4745 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.581675 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582794 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.582865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:55 crc 
kubenswrapper[4745]: I1209 11:31:55.585976 4745 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3047c6676802023e4411b4f2d2ccfd929edc59c00125b8ad91de589e2a8622dd" exitCode=0 Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.586014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3047c6676802023e4411b4f2d2ccfd929edc59c00125b8ad91de589e2a8622dd"} Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.586063 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.588070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.588104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:55 crc kubenswrapper[4745]: I1209 11:31:55.588115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.590694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.590748 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.590761 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.590791 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.591800 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.591846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.591862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.593916 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.593986 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.594018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.594045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.594071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.594258 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.597538 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.597576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.597593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.600656 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749" exitCode=0 Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.600758 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.600786 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.601559 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.601593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.601603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.604721 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.605582 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.606283 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4063a7b7f8deee7c7899e5aa1896e8488f8bf92467369aaf18a6dda6f913e7b8"} Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.607181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.607239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.607199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.607342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.607357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 
11:31:56.607312 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.790840 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.792131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.792167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.792178 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:56 crc kubenswrapper[4745]: I1209 11:31:56.792201 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.556962 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608694 4745 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911" exitCode=0 Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608825 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608855 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608899 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608891 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911"} Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.608860 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.609013 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.609377 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.609593 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610369 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610381 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610398 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610433 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610484 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:57 crc kubenswrapper[4745]: I1209 11:31:57.610681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616370 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809"} Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616420 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755"} Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616427 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616476 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616432 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d"} Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616591 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd"} Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.616633 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6"} Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.618863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.618920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.618937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.621881 4745 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.621929 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.621939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.686553 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.738577 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.738725 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.738764 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.739874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.739945 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:58 crc kubenswrapper[4745]: I1209 11:31:58.739972 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.223923 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.404802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 
11:31:59.618273 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.618318 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619253 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619285 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.619779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:31:59 crc kubenswrapper[4745]: I1209 11:31:59.939830 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.111277 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.111451 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.112467 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.112499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.112532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.120185 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.621284 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.621290 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.621290 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.622741 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.622768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.622777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.622964 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.623007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.623027 
4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.623213 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.623259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:00 crc kubenswrapper[4745]: I1209 11:32:00.623276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.462475 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.623686 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.623761 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.624654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.624688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:01 crc kubenswrapper[4745]: I1209 11:32:01.624696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:03 crc kubenswrapper[4745]: I1209 11:32:03.131386 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:03 crc kubenswrapper[4745]: I1209 11:32:03.131698 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 09 11:32:03 crc kubenswrapper[4745]: I1209 11:32:03.133157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:03 crc kubenswrapper[4745]: I1209 11:32:03.133203 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:03 crc kubenswrapper[4745]: I1209 11:32:03.133218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:03 crc kubenswrapper[4745]: E1209 11:32:03.686961 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.161066 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.161255 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.162453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.162529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.162547 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:04 crc kubenswrapper[4745]: I1209 11:32:04.463318 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 11:32:04 crc 
kubenswrapper[4745]: I1209 11:32:04.463431 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:32:06 crc kubenswrapper[4745]: I1209 11:32:06.509109 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 09 11:32:06 crc kubenswrapper[4745]: E1209 11:32:06.517641 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 09 11:32:06 crc kubenswrapper[4745]: W1209 11:32:06.782490 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 09 11:32:06 crc kubenswrapper[4745]: I1209 11:32:06.782653 4745 trace.go:236] Trace[742311752]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 11:31:56.780) (total time: 10001ms): Dec 09 11:32:06 crc kubenswrapper[4745]: Trace[742311752]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:32:06.782) Dec 09 11:32:06 crc kubenswrapper[4745]: Trace[742311752]: [10.001709162s] [10.001709162s] END Dec 09 11:32:06 crc kubenswrapper[4745]: E1209 11:32:06.782688 4745 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 09 11:32:06 crc kubenswrapper[4745]: E1209 11:32:06.793281 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 09 11:32:07 crc kubenswrapper[4745]: W1209 11:32:07.036604 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 09 11:32:07 crc kubenswrapper[4745]: I1209 11:32:07.036694 4745 trace.go:236] Trace[365023115]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 11:31:57.034) (total time: 10001ms): Dec 09 11:32:07 crc kubenswrapper[4745]: Trace[365023115]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:32:07.036) Dec 09 11:32:07 crc kubenswrapper[4745]: Trace[365023115]: [10.001811146s] [10.001811146s] END Dec 09 11:32:07 crc kubenswrapper[4745]: E1209 11:32:07.036730 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 09 11:32:07 crc kubenswrapper[4745]: E1209 11:32:07.109217 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187f88ba7d67a76b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 11:31:53.501763435 +0000 UTC m=+0.326964969,LastTimestamp:2025-12-09 11:31:53.501763435 +0000 UTC m=+0.326964969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 11:32:07 crc kubenswrapper[4745]: I1209 11:32:07.453072 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 11:32:07 crc kubenswrapper[4745]: I1209 11:32:07.453142 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 11:32:07 crc kubenswrapper[4745]: I1209 11:32:07.459187 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 11:32:07 crc kubenswrapper[4745]: I1209 
11:32:07.459256 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.744086 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.744262 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.745629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.745669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.745678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:08 crc kubenswrapper[4745]: I1209 11:32:08.749483 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.639268 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.640432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.640575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.640604 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.994000 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.996349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.996437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.996573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:09 crc kubenswrapper[4745]: I1209 11:32:09.996624 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:32:10 crc kubenswrapper[4745]: E1209 11:32:10.004580 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.454456 4745 trace.go:236] Trace[716067504]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 11:31:57.698) (total time: 14756ms): Dec 09 11:32:12 crc kubenswrapper[4745]: Trace[716067504]: ---"Objects listed" error: 14756ms (11:32:12.454) Dec 09 11:32:12 crc kubenswrapper[4745]: Trace[716067504]: [14.756213369s] [14.756213369s] END Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.454480 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.455054 4745 trace.go:236] Trace[2064844619]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 11:31:57.675) (total time: 14779ms): Dec 09 11:32:12 crc kubenswrapper[4745]: Trace[2064844619]: ---"Objects listed" error: 14779ms (11:32:12.454) 
Dec 09 11:32:12 crc kubenswrapper[4745]: Trace[2064844619]: [14.779779892s] [14.779779892s] END Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.455089 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.455436 4745 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.456686 4745 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.508055 4745 apiserver.go:52] "Watching apiserver" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511042 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511281 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511582 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511663 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511756 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.511850 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.511771 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511931 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511937 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.511978 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.512233 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.513616 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.513722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.513923 4745 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.513978 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.514548 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.514566 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.514716 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.514727 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.515206 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.516336 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 
09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.544067 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.554819 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556062 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556103 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556128 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556153 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556174 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556190 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556205 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556225 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556246 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556270 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556285 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556301 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556320 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556336 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556351 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556367 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556382 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556400 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556419 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556435 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556450 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556485 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556520 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " 
Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556536 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556551 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556552 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556569 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556663 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556677 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556754 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556799 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556836 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556854 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556871 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556901 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556918 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.556957 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557014 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557034 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557070 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557088 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557104 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557120 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557136 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557152 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557499 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557547 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557565 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557581 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557600 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557615 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557633 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557652 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557668 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557701 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557723 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557748 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557772 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557796 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557816 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557832 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557847 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557866 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557891 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557911 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557931 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557949 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557969 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557989 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558015 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558036 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " 
Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558053 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558070 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558088 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558105 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558122 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558139 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558158 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558176 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558193 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558214 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558232 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " 
Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558249 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558267 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558282 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558298 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558318 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558333 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558351 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558368 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558384 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558481 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558501 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558535 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558554 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558574 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558590 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558606 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558622 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558637 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558654 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558670 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558687 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558705 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558722 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558739 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558765 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558799 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558815 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558832 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558857 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558881 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558907 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.558929 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558947 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558966 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558983 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559000 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559017 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559036 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559055 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559071 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559089 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559105 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559121 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559137 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559167 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559184 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559201 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559223 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559239 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559256 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559271 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559309 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.559329 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559363 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559380 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559395 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559411 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559429 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559447 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559465 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559483 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559499 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559531 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559547 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559564 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559580 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559598 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559616 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559632 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559650 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559667 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559702 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559720 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559740 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559774 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559794 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559811 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559829 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559847 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559865 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559882 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 09 
11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559930 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559948 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559970 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560024 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560048 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560076 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560095 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560111 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560128 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560162 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560181 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560217 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560234 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560250 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" 
(UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560266 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560343 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560362 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560430 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560468 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560486 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560505 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560540 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560575 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560594 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560642 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560654 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571242 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571749 4745 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575841 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556825 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556868 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.556879 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557341 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557442 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557591 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.576945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557834 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557868 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557500 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557902 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.557972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558112 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558142 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558141 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558148 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558191 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558666 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558845 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558857 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558934 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.558948 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559005 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559051 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559096 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559174 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559244 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559251 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559286 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559399 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559425 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559462 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559612 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559671 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559786 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.577758 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559815 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559851 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559922 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.559958 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560022 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560072 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.560451 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561250 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561578 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561602 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561731 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561749 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561751 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.561888 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562184 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562669 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562634 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.562878 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563012 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563223 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.578032 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563344 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563528 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563584 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563717 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563922 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.563983 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564021 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564226 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564283 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564307 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564300 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564307 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.564755 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.565138 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.565204 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.565229 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.565300 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566113 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566223 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566358 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566436 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566451 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566434 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566526 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566607 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566629 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566641 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566807 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566888 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566915 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566960 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.566972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567006 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567210 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567338 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567369 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567445 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567452 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567489 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567501 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567690 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567603 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567834 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567867 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567891 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567919 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.567997 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568413 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568534 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568565 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568632 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568809 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.568829 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569121 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569255 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569458 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569669 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.569698 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570018 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570037 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570083 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570166 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570460 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570475 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570620 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570707 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570838 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570900 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.570925 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571141 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571454 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571581 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571720 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.571909 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572129 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572353 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572579 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572597 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572855 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.572932 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.573085 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:13.073065439 +0000 UTC m=+19.898266973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.578146 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.581056 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573187 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573199 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573577 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573659 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573782 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573918 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.573952 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.574461 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.574711 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.574731 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.574749 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.574806 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575142 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575050 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.575364 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575436 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575472 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.575937 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.576379 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.576389 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.576529 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.577090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.577287 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.577461 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.581432 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:13.081401659 +0000 UTC m=+19.906603173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.581483 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:13.081475761 +0000 UTC m=+19.906677285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.581985 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.584939 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.585098 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.585420 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.591283 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.591691 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.592035 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.592187 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:13.092164423 +0000 UTC m=+19.917365947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.592727 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.594041 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.594372 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.594884 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.594904 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.594914 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.594953 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:13.094938546 +0000 UTC m=+19.920140060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.595062 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.595935 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.597832 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.598033 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.598302 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.598297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.598581 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.598592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.600192 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.601772 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.601837 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.602267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.605436 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.608606 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.610731 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.614640 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.617799 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.620774 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.661950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662048 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662122 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662132 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 
09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662140 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662149 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662157 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662166 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662174 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662182 4745 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662190 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 
11:32:12.662199 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662209 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662219 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662228 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662254 4745 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662268 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662278 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662288 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662299 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662310 4745 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662455 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662601 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662880 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662938 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662948 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.662956 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663009 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663166 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663182 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663248 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663268 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663300 4745 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663322 4745 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663332 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663340 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663349 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663358 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663366 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663374 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663383 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663391 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663399 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663409 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663418 4745 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663426 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663434 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663442 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663450 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663458 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663466 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663474 4745 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663483 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663491 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663500 4745 
reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663528 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663540 4745 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663552 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663562 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663572 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663580 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663590 4745 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663598 4745 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663605 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663613 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663623 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663631 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663640 4745 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663649 4745 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663657 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663665 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663674 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663682 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663690 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663698 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663706 4745 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.663714 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663722 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663729 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663737 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663746 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663754 4745 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663762 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663770 4745 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663778 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663786 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663797 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663806 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663813 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663821 4745 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663830 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 
11:32:12.663838 4745 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663848 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663856 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663865 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663874 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663882 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663890 4745 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663898 4745 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663907 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663915 4745 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663925 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663934 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663942 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663950 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663967 4745 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663976 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663986 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.663997 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664008 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664019 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664030 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664039 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664047 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664056 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664067 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664078 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664088 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664099 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664109 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 
11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664119 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664140 4745 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664151 4745 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664162 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664172 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664183 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664193 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664204 4745 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664214 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664224 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664236 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664245 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664252 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664261 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664270 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664281 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664292 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664302 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664312 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664325 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664337 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664347 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664358 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664369 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664379 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664389 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664399 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664410 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664420 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664431 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664445 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664455 4745 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664466 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664477 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664488 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664499 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664530 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664543 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664555 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664566 4745 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664576 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664588 4745 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664601 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.664657 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664671 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664684 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664701 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664712 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664723 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664735 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 
11:32:12.664746 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664756 4745 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664766 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664774 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664782 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664790 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664798 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664806 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664814 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664821 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664831 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664842 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664853 4745 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664864 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664876 4745 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc 
kubenswrapper[4745]: I1209 11:32:12.664890 4745 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664901 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664913 4745 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664924 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664937 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664950 4745 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664962 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664974 4745 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664986 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.664997 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.665010 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.666695 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44384->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.666736 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44384->192.168.126.11:17697: read: connection reset by peer" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.666835 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44388->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.666849 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44388->192.168.126.11:17697: read: connection reset by peer" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.667152 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.667180 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.667345 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.667367 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.671630 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.675904 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.681788 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.691389 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.699665 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.714546 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.719725 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.735684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.753043 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.777858 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.797328 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.798550 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.810815 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.820278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.829746 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.829798 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.840296 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.840296 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:12 crc kubenswrapper[4745]: W1209 11:32:12.841638 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-22182fff6ff2930d9b9c18f6111ef10d2442c1605f4c0dd46fa549b92896398d WatchSource:0}: Error finding container 22182fff6ff2930d9b9c18f6111ef10d2442c1605f4c0dd46fa549b92896398d: Status 404 returned error can't find the container with id 22182fff6ff2930d9b9c18f6111ef10d2442c1605f4c0dd46fa549b92896398d Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.842784 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.845371 4745 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 11:32:12 crc kubenswrapper[4745]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Dec 09 11:32:12 crc kubenswrapper[4745]: set -o allexport Dec 09 11:32:12 crc kubenswrapper[4745]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 09 11:32:12 crc kubenswrapper[4745]: source /etc/kubernetes/apiserver-url.env Dec 09 11:32:12 crc kubenswrapper[4745]: else Dec 09 11:32:12 crc kubenswrapper[4745]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 09 11:32:12 crc kubenswrapper[4745]: exit 1 Dec 09 11:32:12 crc kubenswrapper[4745]: fi Dec 09 11:32:12 crc kubenswrapper[4745]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 09 11:32:12 crc kubenswrapper[4745]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 09 11:32:12 crc kubenswrapper[4745]: > logger="UnhandledError" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.846572 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Dec 09 11:32:12 crc kubenswrapper[4745]: W1209 11:32:12.854338 4745 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-77b4962a017aa7a57e22e1e9726badcc74d3482f26825c22341b70b1563cdda0 WatchSource:0}: Error finding container 77b4962a017aa7a57e22e1e9726badcc74d3482f26825c22341b70b1563cdda0: Status 404 returned error can't find the container with id 77b4962a017aa7a57e22e1e9726badcc74d3482f26825c22341b70b1563cdda0 Dec 09 11:32:12 crc kubenswrapper[4745]: W1209 11:32:12.855591 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6e7421e0bca38cf080f899c16d22d265cdbf59ebda78559ee809dd66e8cedb73 WatchSource:0}: Error finding container 6e7421e0bca38cf080f899c16d22d265cdbf59ebda78559ee809dd66e8cedb73: Status 404 returned error can't find the container with id 6e7421e0bca38cf080f899c16d22d265cdbf59ebda78559ee809dd66e8cedb73 Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.856489 4745 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 11:32:12 crc kubenswrapper[4745]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 09 11:32:12 crc kubenswrapper[4745]: if [[ -f "/env/_master" ]]; then Dec 09 11:32:12 crc kubenswrapper[4745]: set -o allexport Dec 09 11:32:12 crc kubenswrapper[4745]: source "/env/_master" Dec 09 11:32:12 crc kubenswrapper[4745]: set +o allexport Dec 09 11:32:12 crc kubenswrapper[4745]: fi Dec 09 11:32:12 crc kubenswrapper[4745]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Dec 09 11:32:12 crc kubenswrapper[4745]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Dec 09 11:32:12 crc kubenswrapper[4745]: ho_enable="--enable-hybrid-overlay" Dec 09 11:32:12 crc kubenswrapper[4745]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Dec 09 11:32:12 crc kubenswrapper[4745]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Dec 09 11:32:12 crc kubenswrapper[4745]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Dec 09 11:32:12 crc kubenswrapper[4745]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 09 11:32:12 crc kubenswrapper[4745]: --webhook-cert-dir="/etc/webhook-cert" \ Dec 09 11:32:12 crc kubenswrapper[4745]: --webhook-host=127.0.0.1 \ Dec 09 11:32:12 crc kubenswrapper[4745]: --webhook-port=9743 \ Dec 09 11:32:12 crc kubenswrapper[4745]: ${ho_enable} \ Dec 09 11:32:12 crc kubenswrapper[4745]: --enable-interconnect \ Dec 09 11:32:12 crc kubenswrapper[4745]: --disable-approver \ Dec 09 11:32:12 crc kubenswrapper[4745]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Dec 09 11:32:12 crc kubenswrapper[4745]: --wait-for-kubernetes-api=200s \ Dec 09 11:32:12 crc kubenswrapper[4745]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Dec 09 11:32:12 crc kubenswrapper[4745]: --loglevel="${LOGLEVEL}" Dec 09 11:32:12 crc kubenswrapper[4745]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 09 11:32:12 crc kubenswrapper[4745]: > logger="UnhandledError" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.857721 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.858801 4745 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 11:32:12 crc kubenswrapper[4745]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 09 11:32:12 crc kubenswrapper[4745]: if [[ -f "/env/_master" ]]; then Dec 09 11:32:12 crc kubenswrapper[4745]: set -o allexport Dec 09 11:32:12 crc kubenswrapper[4745]: source "/env/_master" Dec 09 11:32:12 crc kubenswrapper[4745]: set +o allexport Dec 09 11:32:12 crc kubenswrapper[4745]: fi Dec 09 11:32:12 crc kubenswrapper[4745]: Dec 09 11:32:12 crc kubenswrapper[4745]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Dec 09 11:32:12 crc kubenswrapper[4745]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 09 11:32:12 crc kubenswrapper[4745]: --disable-webhook \ Dec 09 11:32:12 crc kubenswrapper[4745]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Dec 09 11:32:12 crc kubenswrapper[4745]: --loglevel="${LOGLEVEL}" Dec 09 11:32:12 crc kubenswrapper[4745]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 09 11:32:12 crc kubenswrapper[4745]: > logger="UnhandledError" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.858880 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Dec 09 11:32:12 crc kubenswrapper[4745]: E1209 11:32:12.860656 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Dec 09 11:32:12 crc kubenswrapper[4745]: I1209 11:32:12.974912 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.052687 4745 csr.go:261] certificate signing request csr-mthk6 is approved, waiting to be issued Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.060644 4745 csr.go:257] certificate signing request csr-mthk6 is issued Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.169933 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.170033 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.170066 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.170089 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.170111 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170148 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:14.170116781 +0000 UTC m=+20.995318305 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170200 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170235 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170258 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170303 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170255 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:14.170239754 +0000 UTC m=+20.995441348 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170315 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170318 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170342 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:14.170322876 +0000 UTC m=+20.995524470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170356 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170374 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170379 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:14.170364367 +0000 UTC m=+20.995565891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.170442 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:14.170423089 +0000 UTC m=+20.995624623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.422078 4745 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422242 4745 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422263 4745 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": 
Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422245 4745 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422289 4745 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422295 4745 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422307 4745 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422303 4745 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422323 4745 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a 
second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422332 4745 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422300 4745 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422313 4745 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422330 4745 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: W1209 11:32:13.422333 4745 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.557703 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.558326 
4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.559308 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.560072 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.560728 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.561288 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.561867 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.562368 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.562960 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.563531 
4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.564041 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.564704 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.565155 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.565747 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.566204 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.566371 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 09 11:32:13 crc 
kubenswrapper[4745]: I1209 11:32:13.567007 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.567683 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.568135 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.571662 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.572233 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.573353 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.574331 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.574858 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 09 11:32:13 crc 
kubenswrapper[4745]: I1209 11:32:13.575602 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.576346 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.577120 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.579276 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.578394 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.579963 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.581092 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.581660 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.582770 4745 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.582893 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.584923 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.585974 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.586816 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.588883 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.589538 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.589727 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.590820 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.591733 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.593225 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.593799 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.595200 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.596735 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.597449 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.598015 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.599112 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.601393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.601798 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.602762 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.603426 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.604472 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.604956 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.606056 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.606821 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.607344 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.611114 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.633378 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.645464 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.649303 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77b4962a017aa7a57e22e1e9726badcc74d3482f26825c22341b70b1563cdda0"} Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.650625 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22182fff6ff2930d9b9c18f6111ef10d2442c1605f4c0dd46fa549b92896398d"} Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.652208 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.654298 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db" exitCode=255 Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.654360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db"} Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.655351 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6e7421e0bca38cf080f899c16d22d265cdbf59ebda78559ee809dd66e8cedb73"} Dec 09 11:32:13 crc kubenswrapper[4745]: E1209 11:32:13.660548 4745 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.660627 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.669122 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.679229 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.685108 4745 scope.go:117] "RemoveContainer" containerID="580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.685743 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.689791 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.700491 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.711093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.723438 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.738756 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.753400 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.768412 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.781871 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.793217 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.823966 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.824607 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2nxln"] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.825214 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.826366 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2nrjq"] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.826554 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r6gmj"] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.826740 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.826983 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.831459 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.831544 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.831480 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.832042 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.832678 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.833060 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.835728 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.835965 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.836155 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.836256 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.838038 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-daemon-bc7sx"] Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.838390 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.843305 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.843343 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.848492 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.848735 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.848905 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.868100 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.898614 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.917246 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.930664 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.937386 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.949078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.958642 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.968605 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 
11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.976905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-kubelet\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977150 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-etc-kubernetes\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977249 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-socket-dir-parent\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-bin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977472 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-multus\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977609 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstbh\" (UniqueName: \"kubernetes.io/projected/1002b34d-f671-4b20-bf4f-492ce3295cc4-kube-api-access-nstbh\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977735 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgt8\" (UniqueName: 
\"kubernetes.io/projected/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-kube-api-access-tqgt8\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977834 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.977952 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-hosts-file\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978065 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-system-cni-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-daemon-config\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978351 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-rootfs\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978432 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-proxy-tls\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978455 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-system-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978474 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-k8s-cni-cncf-io\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978497 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-cni-binary-copy\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978532 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-hostroot\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-conf-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978578 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-os-release\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-mcd-auth-proxy-config\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978622 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-cnibin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978549 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978683 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-binary-copy\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978711 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpmvf\" (UniqueName: \"kubernetes.io/projected/a087421f-143c-4f67-b2a3-38d27e5805ec-kube-api-access-jpmvf\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978736 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mbq6\" (UniqueName: \"kubernetes.io/projected/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-kube-api-access-2mbq6\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978777 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978802 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-multus-certs\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978839 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-cnibin\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978873 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-os-release\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.978898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-netns\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:13 crc kubenswrapper[4745]: I1209 11:32:13.990062 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.000690 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.015830 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.024636 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.035111 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.062202 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-09 11:27:13 +0000 UTC, rotation deadline is 2026-09-11 04:45:39.400068625 +0000 UTC Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.062347 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6617h13m25.337732585s for next certificate rotation Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079759 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-kubelet\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079783 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-etc-kubernetes\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079802 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-socket-dir-parent\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-bin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079969 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-multus\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.079965 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-etc-kubernetes\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080050 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-socket-dir-parent\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080051 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-bin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080099 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-kubelet\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080107 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nstbh\" (UniqueName: \"kubernetes.io/projected/1002b34d-f671-4b20-bf4f-492ce3295cc4-kube-api-access-nstbh\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-var-lib-cni-multus\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080213 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgt8\" (UniqueName: \"kubernetes.io/projected/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-kube-api-access-tqgt8\") pod \"machine-config-daemon-bc7sx\" (UID: 
\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080362 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-hosts-file\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-system-cni-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080417 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-daemon-config\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-rootfs\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080484 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-proxy-tls\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080493 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-system-cni-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080501 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080536 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-system-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080548 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-hosts-file\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 
11:32:14.080561 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-k8s-cni-cncf-io\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080583 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-hostroot\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080595 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-system-cni-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080602 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-conf-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080564 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-rootfs\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-conf-dir\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080625 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-os-release\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080477 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080668 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-mcd-auth-proxy-config\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080620 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-k8s-cni-cncf-io\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080683 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-cnibin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080713 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-cnibin\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080621 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-hostroot\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080750 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-cni-binary-copy\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080772 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-os-release\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080798 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-binary-copy\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " 
pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080825 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpmvf\" (UniqueName: \"kubernetes.io/projected/a087421f-143c-4f67-b2a3-38d27e5805ec-kube-api-access-jpmvf\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080847 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mbq6\" (UniqueName: \"kubernetes.io/projected/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-kube-api-access-2mbq6\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080870 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080896 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-cnibin\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080915 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-multus-certs\") pod \"multus-r6gmj\" (UID: 
\"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080936 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-os-release\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.080955 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-netns\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.081029 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-netns\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.081061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a087421f-143c-4f67-b2a3-38d27e5805ec-cnibin\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.081087 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-host-run-multus-certs\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.081157 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1002b34d-f671-4b20-bf4f-492ce3295cc4-os-release\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.082945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-binary-copy\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.083691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a087421f-143c-4f67-b2a3-38d27e5805ec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.084072 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-mcd-auth-proxy-config\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.084348 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-cni-binary-copy\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.084768 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1002b34d-f671-4b20-bf4f-492ce3295cc4-multus-daemon-config\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.088385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-proxy-tls\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.104323 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpmvf\" (UniqueName: \"kubernetes.io/projected/a087421f-143c-4f67-b2a3-38d27e5805ec-kube-api-access-jpmvf\") pod \"multus-additional-cni-plugins-2nxln\" (UID: \"a087421f-143c-4f67-b2a3-38d27e5805ec\") " pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.112717 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mbq6\" (UniqueName: \"kubernetes.io/projected/34be8f45-03bf-4f76-96d6-b1b8c66f41b1-kube-api-access-2mbq6\") pod \"node-resolver-2nrjq\" (UID: \"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\") " pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.115995 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstbh\" (UniqueName: \"kubernetes.io/projected/1002b34d-f671-4b20-bf4f-492ce3295cc4-kube-api-access-nstbh\") pod \"multus-r6gmj\" (UID: \"1002b34d-f671-4b20-bf4f-492ce3295cc4\") " pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.117214 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgt8\" (UniqueName: 
\"kubernetes.io/projected/a9dc9202-9b7e-4a17-a80f-db9338f17cd7-kube-api-access-tqgt8\") pod \"machine-config-daemon-bc7sx\" (UID: \"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\") " pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.146216 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2nxln" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.155471 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:32:14 crc kubenswrapper[4745]: W1209 11:32:14.158655 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda087421f_143c_4f67_b2a3_38d27e5805ec.slice/crio-8643db6008b5ee1661711644cc66b7b2646858311bbc383c0576b3a67292595a WatchSource:0}: Error finding container 8643db6008b5ee1661711644cc66b7b2646858311bbc383c0576b3a67292595a: Status 404 returned error can't find the container with id 8643db6008b5ee1661711644cc66b7b2646858311bbc383c0576b3a67292595a Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.171942 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r6gmj" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.181340 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.181444 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.181467 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181558 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:16.181500104 +0000 UTC m=+23.006701628 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181602 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.181610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.181636 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181619 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181722 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181733 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181661 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181775 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:16.181762021 +0000 UTC m=+23.006963545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181714 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181791 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-09 11:32:16.181785422 +0000 UTC m=+23.006986946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181798 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181803 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:16.181797562 +0000 UTC m=+23.006999086 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181809 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.181833 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:16.181825343 +0000 UTC m=+23.007026867 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.182880 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2nrjq" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.196844 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.199938 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vwrlh"] Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.200674 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: W1209 11:32:14.206406 4745 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.206696 4745 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.206780 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.207001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.207149 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.207254 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.207578 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.210272 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.215864 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.220196 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 09 11:32:14 crc kubenswrapper[4745]: W1209 11:32:14.231676 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34be8f45_03bf_4f76_96d6_b1b8c66f41b1.slice/crio-b5a56f853b3b29fdddd35190bc82bb9849eb6be64eec8152c044f07381f2959a WatchSource:0}: Error finding container b5a56f853b3b29fdddd35190bc82bb9849eb6be64eec8152c044f07381f2959a: Status 404 returned error can't find the container with id b5a56f853b3b29fdddd35190bc82bb9849eb6be64eec8152c044f07381f2959a Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.235888 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.236637 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.242548 4745 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.255824 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.274743 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282580 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282645 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 
11:32:14.282689 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282709 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282732 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwc4\" (UniqueName: \"kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282792 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc 
kubenswrapper[4745]: I1209 11:32:14.282815 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282856 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282893 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282931 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282960 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.282981 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283030 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283125 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283151 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283169 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283186 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.283204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.285233 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.292251 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.296848 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.306765 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.318805 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.330564 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.342410 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.359163 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.376115 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.377101 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387040 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387125 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc 
kubenswrapper[4745]: I1209 11:32:14.387153 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387216 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387240 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387209 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387303 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc 
kubenswrapper[4745]: I1209 11:32:14.387313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387367 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387410 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387417 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387440 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387470 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387547 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwc4\" (UniqueName: \"kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387586 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387611 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387865 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.387907 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388128 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388328 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc 
kubenswrapper[4745]: I1209 11:32:14.388464 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388570 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388653 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388134 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388783 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388797 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388856 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388965 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch\") pod \"ovnkube-node-vwrlh\" 
(UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.389012 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.388862 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.389094 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.399837 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 
11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.408106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwc4\" (UniqueName: \"kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4\") pod \"ovnkube-node-vwrlh\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") " pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.409608 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.416150 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.423318 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.445162 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.461764 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.474701 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.486638 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.499723 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.541408 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635
cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.554030 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.554055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.554062 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.554133 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.554189 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:14 crc kubenswrapper[4745]: E1209 11:32:14.554311 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.584311 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.624859 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.660222 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerStarted","Data":"57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.660279 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerStarted","Data":"8643db6008b5ee1661711644cc66b7b2646858311bbc383c0576b3a67292595a"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.661065 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.664417 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.664502 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.666866 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2nrjq" event={"ID":"34be8f45-03bf-4f76-96d6-b1b8c66f41b1","Type":"ContainerStarted","Data":"18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.666914 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2nrjq" 
event={"ID":"34be8f45-03bf-4f76-96d6-b1b8c66f41b1","Type":"ContainerStarted","Data":"b5a56f853b3b29fdddd35190bc82bb9849eb6be64eec8152c044f07381f2959a"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.669354 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.671965 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.674060 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.676007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerStarted","Data":"c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.676070 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerStarted","Data":"19ec826d61b64140cf6129733dd33fb6dd066dac39a316e2279a3f287abe9b50"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.678555 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659"} 
Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.678609 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.678628 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"ca025d889f2a5ab44d263a57057e8c024966451c452d88d6dc5fa22fc201b088"} Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.715721 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:14Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.716094 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.764765 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:14Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.811366 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:14Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.814798 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.854048 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.885797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:14Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.948772 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:14Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:14 crc kubenswrapper[4745]: I1209 11:32:14.952760 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.021988 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.049800 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.069268 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.106268 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.148838 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.185883 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.221560 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.269502 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.310612 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.343486 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.386243 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.420720 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.516573 4745 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.516667 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.682435 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689" exitCode=0 Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.682554 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689"} Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.684483 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.684538 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"eaeb40d855e600fe6ba0f9e1295fea6a8d194601cfcd2688bd440f71cbd0036c"} Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.684949 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.698362 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.714106 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.727525 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.739169 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.750139 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.764781 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name
\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.781688 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.801153 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.822915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.839248 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.853757 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.881430 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.929159 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:15 crc kubenswrapper[4745]: I1209 11:32:15.967115 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.001383 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:15Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.042249 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.091007 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.122442 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.162575 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.202366 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.206724 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.206825 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.206867 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.206927 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.206954 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.206929 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:20.206901369 +0000 UTC m=+27.032102893 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.207022 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.207054 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207136 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207163 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:20.207126745 +0000 UTC m=+27.032328289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207193 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:20.207184287 +0000 UTC m=+27.032385801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207166 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207217 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207260 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-09 11:32:20.207254409 +0000 UTC m=+27.032455933 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207138 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207293 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207302 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.207329 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:20.207323531 +0000 UTC m=+27.032525055 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.241217 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.281584 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.323261 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.364064 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.404738 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.404836 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 11:32:16 crc 
kubenswrapper[4745]: I1209 11:32:16.406412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.406450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.406460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.406589 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.463025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.474741 4745 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.475001 4745 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.477041 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.477087 4745 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.477098 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.477113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.477124 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.494891 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.498258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.498295 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.498305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.498321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.498331 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.509055 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.513098 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.513130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.513139 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.513156 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.513168 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.524243 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.524759 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.527771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.527813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.527822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.527836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.527845 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.539520 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.543779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.544000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.544020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.544037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.544060 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.554360 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.554434 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.554361 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.554470 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.554588 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.554664 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.559692 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: E1209 11:32:16.560007 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.561955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.562096 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.562173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.562252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.562345 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.562409 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.606381 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.665538 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.665747 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.665861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.665931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.665998 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.688256 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e" exitCode=0 Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.688346 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.690523 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0" exitCode=0 Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.690594 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.692519 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.712447 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.726988 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.740801 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.760738 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.768956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.768982 4745 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.768990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.769004 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.769013 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.804038 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.843804 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.850205 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zrwjr"] Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.850606 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.871110 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.871151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.871160 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.871175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.871184 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.874179 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.893716 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.913266 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-serviceca\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.913304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5psll\" (UniqueName: \"kubernetes.io/projected/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-kube-api-access-5psll\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.913335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-host\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.914373 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.933874 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 
11:32:16.960222 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:16Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.973533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.973571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:16 crc 
kubenswrapper[4745]: I1209 11:32:16.973582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.973607 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:16 crc kubenswrapper[4745]: I1209 11:32:16.973619 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:16Z","lastTransitionTime":"2025-12-09T11:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.003847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.014619 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-serviceca\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.014661 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5psll\" (UniqueName: \"kubernetes.io/projected/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-kube-api-access-5psll\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.014686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-host\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " 
pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.014758 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-host\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.015610 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-serviceca\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.052241 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5psll\" (UniqueName: \"kubernetes.io/projected/8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea-kube-api-access-5psll\") pod \"node-ca-zrwjr\" (UID: \"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\") " pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.065607 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.076126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.076158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.076166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.076180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.076188 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.104008 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.144210 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.164698 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zrwjr" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.179152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.179190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.179200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.179214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.179225 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.182913 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: W1209 11:32:17.183248 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9f8d51_22d8_4f69_b81f_9ee4ae5bc8ea.slice/crio-567ea325bd5a64df46b96ba7c4ff11b6730fb240bf84f6e3db00a86ae338286f WatchSource:0}: Error finding container 567ea325bd5a64df46b96ba7c4ff11b6730fb240bf84f6e3db00a86ae338286f: Status 404 returned error can't find the container with id 567ea325bd5a64df46b96ba7c4ff11b6730fb240bf84f6e3db00a86ae338286f Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.221858 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.266629 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.282950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.283014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.283025 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.283044 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.283053 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.305902 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.342282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.380248 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.385478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.385675 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.385780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.385861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.385950 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.450524 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.461226 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.488478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.488504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.488528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 
11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.488540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.488549 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.508533 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.543200 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.584484 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.590804 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.590836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.590845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.590858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.590868 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.625321 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.662878 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.692542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.692576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.692586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.692600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.692609 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696681 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696715 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696728 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696739 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696750 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.696759 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" 
event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.698411 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747" exitCode=0 Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.698464 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.701492 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrwjr" event={"ID":"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea","Type":"ContainerStarted","Data":"fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.701596 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrwjr" event={"ID":"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea","Type":"ContainerStarted","Data":"567ea325bd5a64df46b96ba7c4ff11b6730fb240bf84f6e3db00a86ae338286f"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.711353 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.744481 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.783095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.796338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.796374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.796382 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.796397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.796406 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.823848 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e1377
6a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.862221 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.893840 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:17 crc 
kubenswrapper[4745]: I1209 11:32:17.898635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.898672 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.898683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.898697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.898707 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:17Z","lastTransitionTime":"2025-12-09T11:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.921909 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:17 crc kubenswrapper[4745]: I1209 11:32:17.960504 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000621 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000636 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000672 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.000938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57
059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:17Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.041949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.081306 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.103101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.103151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.103161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.103178 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.103188 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.122367 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.162949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.207627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.208004 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.208013 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.208033 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.208043 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.208781 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 
11:32:18.239712 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.286116 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.310654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.310690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.310699 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.310713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.310724 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.320951 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.377346 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6
739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.403497 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.413648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.413688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.413697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.413712 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.413722 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.443625 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.481393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.515918 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.515959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.515970 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.515986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.515998 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.553760 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:18 crc kubenswrapper[4745]: E1209 11:32:18.553863 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.553778 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:18 crc kubenswrapper[4745]: E1209 11:32:18.553921 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.553766 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:18 crc kubenswrapper[4745]: E1209 11:32:18.554232 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.618370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.618427 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.618437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.618450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.618466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.706415 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12" exitCode=0 Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.706469 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.721736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.721780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.721789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.721807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.721816 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.722579 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.737354 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.750426 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.761252 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.773755 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.786835 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.799537 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.811938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.824567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.824603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.824616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.824632 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.824646 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.840563 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.885829 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.921193 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.927020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.927043 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.927051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.927063 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.927073 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:18Z","lastTransitionTime":"2025-12-09T11:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:18 crc kubenswrapper[4745]: I1209 11:32:18.994333 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.013644 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.029356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.029397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.029416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.029436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.029448 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.042967 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.080344 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.101989 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.132362 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.132408 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.132418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.132435 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.132448 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.234732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.234769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.234781 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.234796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.234808 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.336770 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.336807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.336816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.336835 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.336848 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.438431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.438477 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.438492 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.438529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.438549 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.541162 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.541202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.541210 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.541226 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.541235 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.643856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.643930 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.643938 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.643950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.643959 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.712575 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.715389 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea" exitCode=0 Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.715432 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.730671 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.742689 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.747544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.747574 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.747582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.747598 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.747607 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.757173 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.770748 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.783678 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.795603 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.813070 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.823335 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.843795 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.853687 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.853721 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.853730 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.853742 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.853751 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.859026 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.872047 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.884046 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.896116 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.907268 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.918168 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:19Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.956488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.956538 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.956548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:19 crc 
kubenswrapper[4745]: I1209 11:32:19.956563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:19 crc kubenswrapper[4745]: I1209 11:32:19.956574 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:19Z","lastTransitionTime":"2025-12-09T11:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.058447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.058489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.058501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.058535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.058547 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.161237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.161272 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.161282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.161297 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.161308 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.244665 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.244885 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.244909 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.244929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.244964 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245022 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245065 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.245052967 +0000 UTC m=+35.070254491 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245374 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.245364625 +0000 UTC m=+35.070566149 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245431 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245453 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.245447278 +0000 UTC m=+35.070648802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245499 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245533 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245545 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245567 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.245561121 +0000 UTC m=+35.070762645 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245604 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245612 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245619 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.245637 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.245631312 +0000 UTC m=+35.070832836 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.263058 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.263090 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.263101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.263117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.263130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.278012 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.366581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.366614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.366625 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.366639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.366651 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.468881 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.468906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.468914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.468925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.468933 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.554663 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.554694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.554704 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.554833 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.554912 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:20 crc kubenswrapper[4745]: E1209 11:32:20.554969 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.571221 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.571316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.571331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.571361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.571374 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.673535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.673595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.673612 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.673635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.673647 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.727559 4745 generic.go:334] "Generic (PLEG): container finished" podID="a087421f-143c-4f67-b2a3-38d27e5805ec" containerID="ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f" exitCode=0 Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.727612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerDied","Data":"ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.745231 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.760204 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776717 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776784 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776844 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776828 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.776863 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.805831 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.819416 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.831877 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.843594 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.856299 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.867546 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.878905 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.880093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.880125 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.880136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.880153 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.880166 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.894478 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.905967 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.921854 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.930847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.941317 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:20Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.983015 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.983271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.983338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.983405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:20 crc kubenswrapper[4745]: I1209 11:32:20.983461 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:20Z","lastTransitionTime":"2025-12-09T11:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.085317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.085347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.085355 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.085368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.085377 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.187587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.187621 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.187629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.187642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.187650 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.290005 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.290052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.290066 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.290087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.290104 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.392103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.392136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.392144 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.392160 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.392170 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.495027 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.495061 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.495072 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.495089 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.495099 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.597266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.597307 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.597317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.597335 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.597350 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.699368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.699396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.699404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.699417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.699425 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.737394 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" event={"ID":"a087421f-143c-4f67-b2a3-38d27e5805ec","Type":"ContainerStarted","Data":"0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.752315 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.763560 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.774000 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.784499 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.795278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.801825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.801862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.801870 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.801884 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.801898 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.806659 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.817386 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.829035 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.836949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.852431 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.861859 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.878063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.889651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.901540 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.903969 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.904003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.904014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.904031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.904043 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:21Z","lastTransitionTime":"2025-12-09T11:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:21 crc kubenswrapper[4745]: I1209 11:32:21.912396 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:21Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.008691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.008735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.008751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.008768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.008780 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.111468 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.111563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.111577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.111592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.111602 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.213476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.213520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.213529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.213541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.213550 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.316456 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.316528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.316586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.316611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.316628 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.418826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.418866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.418876 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.418891 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.418903 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.520633 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.520670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.520678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.520694 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.520706 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.554039 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.554089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:22 crc kubenswrapper[4745]: E1209 11:32:22.554171 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.554089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:22 crc kubenswrapper[4745]: E1209 11:32:22.554287 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:22 crc kubenswrapper[4745]: E1209 11:32:22.554300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.623405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.623448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.623461 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.623478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.623489 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.725611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.725645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.725655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.725670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.725680 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.744724 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.762101 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.773927 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.792090 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.807545 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.815961 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.821190 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.827644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.827692 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.827704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc 
kubenswrapper[4745]: I1209 11:32:22.827723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.827734 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.836600 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.850468 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.863295 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.874094 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.886235 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.903023 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.912805 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.923229 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.929649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.929682 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 
11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.929694 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.929709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.929721 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:22Z","lastTransitionTime":"2025-12-09T11:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.936060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:22 crc kubenswrapper[4745]: I1209 11:32:22.944978 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:22Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.031324 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.031356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.031365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.031377 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.031387 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.134203 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.134254 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.134264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.134286 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.134300 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.236596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.236636 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.236649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.236668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.236682 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.340603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.340645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.340658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.340674 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.340686 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.444217 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.444257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.444271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.444290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.444306 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.547977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.548056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.548071 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.548088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.548098 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.569300 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.584146 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.593018 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.615402 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77
a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.624811 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e89
82b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.641659 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.650626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.650671 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.650683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.650701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.650714 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.654588 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.667544 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.678965 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.697970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.711807 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.723849 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.736609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.747021 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.747464 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.747527 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.751524 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.753384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.753411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.753421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.753436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.753446 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.765170 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.813492 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.814573 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.824851 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.845354 4745 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.855846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.855889 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.855899 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.855931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.855943 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.860065 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.875559 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.891477 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.901186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.919318 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\
"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.927456 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e
4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.944856 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.958627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.958661 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.958669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.958682 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.958691 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:23Z","lastTransitionTime":"2025-12-09T11:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.959808 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:23 crc kubenswrapper[4745]: I1209 11:32:23.998287 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.018219 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.041455 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.052038 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.061120 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.061157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.061167 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.061181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.061193 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.064266 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57
059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.077147 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.088535 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.099802 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.112235 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.124684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.133836 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.150106 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.159841 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984
551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.163458 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.163489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.163500 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.163531 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.163550 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.178117 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.190822 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.202388 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.213144 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.224090 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.236193 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.246273 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:24Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.266011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.266045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.266053 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc 
kubenswrapper[4745]: I1209 11:32:24.266069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.266078 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.368611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.368654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.368669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.368685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.368696 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.471424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.471462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.471470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.471483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.471492 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.554604 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.554658 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.554616 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:24 crc kubenswrapper[4745]: E1209 11:32:24.554716 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:24 crc kubenswrapper[4745]: E1209 11:32:24.554816 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:24 crc kubenswrapper[4745]: E1209 11:32:24.554874 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.573499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.573553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.573561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.573575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.573586 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.676186 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.676237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.676246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.676260 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.676269 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.751379 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.778928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.778977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.778987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.779002 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.779011 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.882122 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.882151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.882158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.882172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.882182 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.995955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.996026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.996052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.996082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:24 crc kubenswrapper[4745]: I1209 11:32:24.996103 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:24Z","lastTransitionTime":"2025-12-09T11:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.098917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.098955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.098963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.098981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.098990 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.202243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.202330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.202375 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.202405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.202427 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.304858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.304915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.304932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.304956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.304973 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.407859 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.407914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.407933 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.407954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.407970 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.510551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.510602 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.510620 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.510641 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.510654 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.613244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.613281 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.613290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.613306 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.613314 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.715444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.715528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.715542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.715558 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.715570 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.755367 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/0.log" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.758340 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60" exitCode=1 Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.758389 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.759312 4745 scope.go:117] "RemoveContainer" containerID="8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.778296 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.790880 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.803665 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.817909 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.817947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.817962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.817984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.818000 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.830982 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.843483 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.854349 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.864646 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.876917 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.888067 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.899302 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.912226 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.919987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.920014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.920023 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.920038 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.920050 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:25Z","lastTransitionTime":"2025-12-09T11:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.922836 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.941970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:25Z\\\",\\\"message\\\":\\\"dler 8\\\\nI1209 11:32:25.145757 6027 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 11:32:25.146246 6027 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 11:32:25.146413 6027 reflector.go:311] Stopping 
reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 11:32:25.146449 6027 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146623 6027 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146867 6027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147160 6027 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147334 6027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147417 6027 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.952703 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:25 crc kubenswrapper[4745]: I1209 11:32:25.964171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:25Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.002886 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd"] Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.003353 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.006438 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.006643 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.017524 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.021702 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.021733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.021743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.021759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.021770 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.028381 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d694900-c0af-4c93-be86-8b9e4467d152-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.028492 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.028534 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.028558 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9zc\" (UniqueName: \"kubernetes.io/projected/6d694900-c0af-4c93-be86-8b9e4467d152-kube-api-access-7v9zc\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 
11:32:26.030687 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc0
97d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.038485 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.056010 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:25Z\\\",\\\"message\\\":\\\"dler 8\\\\nI1209 11:32:25.145757 6027 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 11:32:25.146246 6027 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 11:32:25.146413 6027 reflector.go:311] Stopping 
reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 11:32:25.146449 6027 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146623 6027 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146867 6027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147160 6027 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147334 6027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147417 6027 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.064314 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.094282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.108342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.122643 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.124413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.124498 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.124528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.124548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.124560 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.129071 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.129123 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.129156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9zc\" (UniqueName: \"kubernetes.io/projected/6d694900-c0af-4c93-be86-8b9e4467d152-kube-api-access-7v9zc\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.129202 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d694900-c0af-4c93-be86-8b9e4467d152-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.130001 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.130416 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d694900-c0af-4c93-be86-8b9e4467d152-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.134002 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d694900-c0af-4c93-be86-8b9e4467d152-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.139634 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.145845 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9zc\" (UniqueName: \"kubernetes.io/projected/6d694900-c0af-4c93-be86-8b9e4467d152-kube-api-access-7v9zc\") pod \"ovnkube-control-plane-749d76644c-cxwgd\" (UID: \"6d694900-c0af-4c93-be86-8b9e4467d152\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.150997 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.163165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.173294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.185382 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.199633 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.210214 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.220469 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.227109 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.227137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.227145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.227160 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.227169 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.313775 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" Dec 09 11:32:26 crc kubenswrapper[4745]: W1209 11:32:26.327025 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d694900_c0af_4c93_be86_8b9e4467d152.slice/crio-551c34f8b04c0fe9dbeb0d88fd68c6ed8ba1dcae1d6e01fae9305e9fa734de5e WatchSource:0}: Error finding container 551c34f8b04c0fe9dbeb0d88fd68c6ed8ba1dcae1d6e01fae9305e9fa734de5e: Status 404 returned error can't find the container with id 551c34f8b04c0fe9dbeb0d88fd68c6ed8ba1dcae1d6e01fae9305e9fa734de5e Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.330354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.330378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.330387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.330400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.330408 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.432080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.432442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.432467 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.432498 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.432574 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.534405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.534444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.534453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.534471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.534483 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.553850 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.553926 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.553862 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.553968 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.554018 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.554176 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.636400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.636431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.636439 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.636452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.636461 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.738880 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.738922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.738939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.738961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.738975 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.762017 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" event={"ID":"6d694900-c0af-4c93-be86-8b9e4467d152","Type":"ContainerStarted","Data":"551c34f8b04c0fe9dbeb0d88fd68c6ed8ba1dcae1d6e01fae9305e9fa734de5e"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.795021 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.795064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.795079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.795105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.795119 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.807345 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.812025 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.812064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.812076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.812094 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.812104 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.824319 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.827658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.827681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.827691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.827707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.827719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.840252 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.843499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.843547 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.843556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.843571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.843581 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.853534 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.856866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.856902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.856911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.856924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.856933 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.868525 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:26Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:26 crc kubenswrapper[4745]: E1209 11:32:26.868630 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.870302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.870333 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.870343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.870386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.870400 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.972202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.972231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.972238 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.972250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:26 crc kubenswrapper[4745]: I1209 11:32:26.972259 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:26Z","lastTransitionTime":"2025-12-09T11:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.074591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.074631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.074642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.074659 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.074669 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.177448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.177474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.177481 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.177493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.177501 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.280116 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.280275 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.280351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.280443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.280532 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.382745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.382775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.382799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.382812 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.382821 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.476084 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jdv4j"] Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.476556 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: E1209 11:32:27.476616 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.485463 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.485494 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.485524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.485541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.485552 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.497945 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae
63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.513308 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.531554 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.540021 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gjh\" 
(UniqueName: \"kubernetes.io/projected/ea6befdd-80ca-42c2-813f-62a5cdff9605-kube-api-access-h9gjh\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.540082 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.541473 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.550780 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc 
kubenswrapper[4745]: I1209 11:32:27.564790 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.578723 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588344 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588331 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a307
45209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.588474 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.605785 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:25Z\\\",\\\"message\\\":\\\"dler 8\\\\nI1209 11:32:25.145757 6027 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 11:32:25.146246 6027 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 11:32:25.146413 6027 reflector.go:311] Stopping 
reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 11:32:25.146449 6027 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146623 6027 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146867 6027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147160 6027 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147334 6027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147417 6027 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.615107 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.631864 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.640765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.640855 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gjh\" (UniqueName: 
\"kubernetes.io/projected/ea6befdd-80ca-42c2-813f-62a5cdff9605-kube-api-access-h9gjh\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: E1209 11:32:27.640920 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:27 crc kubenswrapper[4745]: E1209 11:32:27.640985 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:28.140970194 +0000 UTC m=+34.966171708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.645114 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.656694 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gjh\" (UniqueName: \"kubernetes.io/projected/ea6befdd-80ca-42c2-813f-62a5cdff9605-kube-api-access-h9gjh\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.658180 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.670427 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.681648 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.690031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.690068 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.690080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.690096 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.690107 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.693107 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.704595 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.765768 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/1.log" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.766249 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/0.log" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.768840 4745 
generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c" exitCode=1 Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.768921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.768964 4745 scope.go:117] "RemoveContainer" containerID="8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.769990 4745 scope.go:117] "RemoveContainer" containerID="1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c" Dec 09 11:32:27 crc kubenswrapper[4745]: E1209 11:32:27.770450 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.770663 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" event={"ID":"6d694900-c0af-4c93-be86-8b9e4467d152","Type":"ContainerStarted","Data":"3f5f42b6fd205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.770701 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" event={"ID":"6d694900-c0af-4c93-be86-8b9e4467d152","Type":"ContainerStarted","Data":"ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10"} Dec 09 
11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.786943 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9
a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.792490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.792555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.792567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.792583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.792593 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.797613 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.833737 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:25Z\\\",\\\"message\\\":\\\"dler 8\\\\nI1209 11:32:25.145757 6027 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 11:32:25.146246 6027 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 11:32:25.146413 6027 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 11:32:25.146449 6027 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146623 6027 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146867 6027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147160 6027 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147334 6027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147417 6027 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"
host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.846477 
4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.862927 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.878906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.893344 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.894630 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.894673 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.894683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.894703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.894713 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.905044 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.924359 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-reso
urces-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.936401 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.949416 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.963131 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.976144 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.991484 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:27Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.996560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:27 crc 
kubenswrapper[4745]: I1209 11:32:27.996598 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.996611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.996629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:27 crc kubenswrapper[4745]: I1209 11:32:27.996641 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:27Z","lastTransitionTime":"2025-12-09T11:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.004009 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.015336 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc 
kubenswrapper[4745]: I1209 11:32:28.028658 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.041794 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.058527 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.067769 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.083890 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eed2ad18b3a8a3a43e88737d06d240b4911cbaf08355291161bf3013e49fa60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:25Z\\\",\\\"message\\\":\\\"dler 8\\\\nI1209 11:32:25.145757 6027 handler.go:208] Removed *v1.Node event handler 
2\\\\nI1209 11:32:25.146246 6027 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 11:32:25.146413 6027 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 11:32:25.146449 6027 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146623 6027 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 11:32:25.146867 6027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147160 6027 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147334 6027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1209 11:32:25.147417 6027 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35
a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.093757 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.098490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.098545 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.098555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.098570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.098579 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.111550 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.123632 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.135719 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.147369 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.149930 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.150081 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.150149 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:29.150132048 +0000 UTC m=+35.975333572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.159301 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.173095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.185864 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.199494 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.200986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.201057 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.201252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.203662 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.203678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.215765 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e1377
6a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.232547 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.245139 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.251371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:28 crc 
kubenswrapper[4745]: E1209 11:32:28.251600 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:32:44.251568454 +0000 UTC m=+51.076769988 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.251662 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.251727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.251762 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.251796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251825 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251863 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251888 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251905 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251909 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-09 11:32:44.251872082 +0000 UTC m=+51.077073606 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251945 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:44.251930903 +0000 UTC m=+51.077132607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251951 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251973 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.251983 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.252005 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.252017 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:44.252007865 +0000 UTC m=+51.077209389 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.252059 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:44.252046826 +0000 UTC m=+51.077248370 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.259955 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc 
kubenswrapper[4745]: I1209 11:32:28.306936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.306965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.306973 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.306986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.306996 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.410763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.410842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.410864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.410895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.410915 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.513750 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.513784 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.513793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.513806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.513816 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.554722 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.554789 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.554822 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.554940 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.555349 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.555425 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.616455 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.616544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.616563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.616585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.616602 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.720015 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.720059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.720072 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.720091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.720105 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.781888 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/1.log" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.787665 4745 scope.go:117] "RemoveContainer" containerID="1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c" Dec 09 11:32:28 crc kubenswrapper[4745]: E1209 11:32:28.787972 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.805404 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822923 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822945 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822867 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.822958 
4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.838240 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.851304 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.868060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.885084 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc 
kubenswrapper[4745]: I1209 11:32:28.907929 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.925551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.925592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.925604 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.925621 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.925631 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:28Z","lastTransitionTime":"2025-12-09T11:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.926773 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.940968 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.965054 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.977135 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:28 crc kubenswrapper[4745]: I1209 11:32:28.988937 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:28Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.002739 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.014621 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.026229 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.028027 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.028086 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.028100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.028117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.028129 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.065547 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:
56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.090393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.131813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.131879 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.131898 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.131928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.131948 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.165847 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:29 crc kubenswrapper[4745]: E1209 11:32:29.166102 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:29 crc kubenswrapper[4745]: E1209 11:32:29.166209 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:31.166179523 +0000 UTC m=+37.991381087 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.231166 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.234592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.234646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.234657 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.234677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.234690 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.244590 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.264298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.281182 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.297071 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.314660 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.338640 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.357893 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.382691 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-reso
urces-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.400230 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.418019 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.432193 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.446941 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.465283 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.465334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.465345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.465361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.465380 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.467915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.481501 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.496487 4745 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc 
kubenswrapper[4745]: I1209 11:32:29.511295 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.528775 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:29Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.554457 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:29 crc kubenswrapper[4745]: E1209 11:32:29.554641 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.567399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.567444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.567453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.567466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.567476 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.669353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.669415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.669428 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.669445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.669459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.772752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.772799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.772810 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.772824 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.772835 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.877218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.877840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.877866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.877901 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.877925 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.981782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.981823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.981835 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.981851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:29 crc kubenswrapper[4745]: I1209 11:32:29.981860 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:29Z","lastTransitionTime":"2025-12-09T11:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.084885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.084990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.085010 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.085041 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.085062 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.187080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.187120 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.187141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.187159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.187172 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.290239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.290298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.290315 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.290343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.290362 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.392668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.392753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.392767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.392787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.392798 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.495449 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.495501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.495533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.495551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.495563 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.554887 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.555037 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:30 crc kubenswrapper[4745]: E1209 11:32:30.555102 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.555115 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:30 crc kubenswrapper[4745]: E1209 11:32:30.555273 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:30 crc kubenswrapper[4745]: E1209 11:32:30.555382 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.598916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.599440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.599666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.599886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.600063 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.703745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.703798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.703815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.703834 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.703849 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.808072 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.808153 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.808175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.808206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.808242 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.912569 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.912645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.912665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.912696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:30 crc kubenswrapper[4745]: I1209 11:32:30.912715 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:30Z","lastTransitionTime":"2025-12-09T11:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.015543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.015586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.015597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.015613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.015625 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.118860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.118902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.118916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.118937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.118951 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.191080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:31 crc kubenswrapper[4745]: E1209 11:32:31.191208 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:31 crc kubenswrapper[4745]: E1209 11:32:31.191292 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:35.191268982 +0000 UTC m=+42.016470516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.222196 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.222242 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.222259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.222280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.222296 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.324785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.324836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.324851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.324871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.324887 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.427138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.427206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.427229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.427257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.427279 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.529397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.529462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.529484 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.529548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.529574 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.554277 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:31 crc kubenswrapper[4745]: E1209 11:32:31.554389 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.632683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.632760 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.632778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.632805 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.632823 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.734970 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.735056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.735073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.735094 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.735110 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.838308 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.838387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.838406 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.838432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.838473 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.941021 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.941057 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.941117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.941137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:31 crc kubenswrapper[4745]: I1209 11:32:31.941147 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:31Z","lastTransitionTime":"2025-12-09T11:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.044193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.044240 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.044251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.044268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.044280 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.147156 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.147190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.147199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.147212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.147221 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.249710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.249759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.249772 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.249791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.249805 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.352242 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.352287 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.352315 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.352378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.352389 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.454330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.454385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.454395 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.454409 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.454418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.554829 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.554880 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.554848 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:32 crc kubenswrapper[4745]: E1209 11:32:32.554973 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:32 crc kubenswrapper[4745]: E1209 11:32:32.555062 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:32 crc kubenswrapper[4745]: E1209 11:32:32.555137 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.556027 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.556056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.556066 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.556083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.556095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.659020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.659064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.659076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.659092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.659105 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.761456 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.761536 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.761551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.761571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.761584 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.863681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.863729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.863743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.863760 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.863774 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.966397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.966456 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.966477 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.966499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:32 crc kubenswrapper[4745]: I1209 11:32:32.966543 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:32Z","lastTransitionTime":"2025-12-09T11:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.069115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.069149 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.069157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.069170 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.069181 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.171788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.171822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.171832 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.171847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.171857 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.274330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.274382 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.274396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.274418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.274439 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.377386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.377432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.377443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.377461 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.377471 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.479811 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.479852 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.479863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.479877 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.479888 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.553976 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:33 crc kubenswrapper[4745]: E1209 11:32:33.554122 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.566191 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.577032 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.581689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.581734 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.581743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc 
kubenswrapper[4745]: I1209 11:32:33.581756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.581770 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.588366 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.600316 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.609889 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.619007 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc 
kubenswrapper[4745]: I1209 11:32:33.629923 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.640567 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.649530 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.665952 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.675467 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.683489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.683640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.683663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.683682 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.683692 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.687889 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.700633 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.712172 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.723011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.739968 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.752144 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:33Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.785401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.785433 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.785441 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.785454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.785463 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.888487 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.888539 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.888549 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.888568 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.888577 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.990935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.990983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.990991 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.991006 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:33 crc kubenswrapper[4745]: I1209 11:32:33.991019 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:33Z","lastTransitionTime":"2025-12-09T11:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.093140 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.093214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.093233 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.093252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.093267 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.195848 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.195904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.195920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.195943 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.195975 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.297992 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.298047 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.298059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.298078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.298089 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.400420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.400465 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.400477 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.400495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.400542 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.502685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.502752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.502769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.502794 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.502811 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.554782 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.554792 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:34 crc kubenswrapper[4745]: E1209 11:32:34.555015 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.554828 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:34 crc kubenswrapper[4745]: E1209 11:32:34.555148 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:34 crc kubenswrapper[4745]: E1209 11:32:34.555331 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.605160 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.605205 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.605214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.605228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.605237 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.707830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.707863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.707872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.707886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.707894 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.809591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.809659 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.809679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.809700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.809717 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.911865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.911921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.911933 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.911948 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:34 crc kubenswrapper[4745]: I1209 11:32:34.911960 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:34Z","lastTransitionTime":"2025-12-09T11:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.014448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.014544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.014568 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.014598 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.014628 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.118113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.118145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.118153 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.118166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.118175 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.220996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.221049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.221064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.221083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.221098 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.232890 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:35 crc kubenswrapper[4745]: E1209 11:32:35.233033 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:35 crc kubenswrapper[4745]: E1209 11:32:35.233109 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:43.233089778 +0000 UTC m=+50.058291302 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.324351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.324403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.324420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.324440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.324454 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.426676 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.426723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.426739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.426761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.426771 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.529107 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.529168 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.529180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.529200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.529212 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.554276 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:35 crc kubenswrapper[4745]: E1209 11:32:35.554419 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.631995 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.632042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.632059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.632076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.632097 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.735464 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.735559 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.735571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.735625 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.735638 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.838193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.838239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.838250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.838268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.838281 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.941043 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.941097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.941106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.941120 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:35 crc kubenswrapper[4745]: I1209 11:32:35.941128 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:35Z","lastTransitionTime":"2025-12-09T11:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.043354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.043415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.043429 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.043448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.043495 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.145934 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.145979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.146020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.146037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.146046 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.248838 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.248891 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.248908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.248931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.248950 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.354112 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.354326 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.354372 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.354411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.354453 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.458597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.458664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.458683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.458711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.458732 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.554744 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.554765 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:36 crc kubenswrapper[4745]: E1209 11:32:36.554915 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.554756 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:36 crc kubenswrapper[4745]: E1209 11:32:36.555101 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:36 crc kubenswrapper[4745]: E1209 11:32:36.555244 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.561963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.562016 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.562033 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.562060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.562077 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.665101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.665182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.665204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.665233 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.665267 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.768769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.768818 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.768829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.768847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.768859 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.871356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.871417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.871433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.871458 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.871475 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.969605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.969689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.969713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.969743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.969768 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:36 crc kubenswrapper[4745]: E1209 11:32:36.987836 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:36Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.991899 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.991963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.991982 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.992007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:36 crc kubenswrapper[4745]: I1209 11:32:36.992040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:36Z","lastTransitionTime":"2025-12-09T11:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.011987 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list identical to the first status-patch error above, omitted ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:37Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.016312 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.016370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.016390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.016415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.016433 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.036590 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list identical to the first status-patch error above, omitted ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:37Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.041186 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.041252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.041265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.041283 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.041295 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.055855 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:37Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.060871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.060933 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.060951 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.060976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.060992 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.080259 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:37Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.080410 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.082019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.082064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.082073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.082090 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.082101 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.185296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.185361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.185371 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.185387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.185397 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.288369 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.288415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.288428 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.288448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.288460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.390704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.390774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.390787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.390803 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.390816 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.493651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.493713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.493736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.493771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.493794 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.554170 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:37 crc kubenswrapper[4745]: E1209 11:32:37.554465 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.596864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.596932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.596949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.596972 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.596991 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.700701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.700969 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.701088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.701174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.701260 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.803469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.803704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.803763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.803848 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.803908 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.907701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.907757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.907768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.907789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:37 crc kubenswrapper[4745]: I1209 11:32:37.907801 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:37Z","lastTransitionTime":"2025-12-09T11:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.010645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.010700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.010710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.010731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.010742 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.115238 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.115315 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.115335 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.115371 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.115394 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.218334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.218394 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.218427 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.218452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.218470 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.321704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.321776 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.321796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.321828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.321848 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.425073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.425145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.425164 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.425191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.425209 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.528892 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.528978 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.529008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.529040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.529059 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.554833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.554915 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.554915 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:38 crc kubenswrapper[4745]: E1209 11:32:38.555025 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:38 crc kubenswrapper[4745]: E1209 11:32:38.555248 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:38 crc kubenswrapper[4745]: E1209 11:32:38.555315 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.633239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.633359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.633383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.633419 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.633444 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.736296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.736351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.736361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.736383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:38 crc kubenswrapper[4745]: I1209 11:32:38.736395 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:38Z","lastTransitionTime":"2025-12-09T11:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.055321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.055363 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.055374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.055396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.055410 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.158872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.158955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.158979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.159015 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.159036 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.262093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.262178 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.262195 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.262225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.262243 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.366589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.366632 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.366640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.366656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.366665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.470074 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.470184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.470214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.470298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.470327 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.554707 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:39 crc kubenswrapper[4745]: E1209 11:32:39.554987 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.572753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.572817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.572831 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.572855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.572870 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.676840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.676903 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.676921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.676962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.677007 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.780778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.780833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.780848 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.780874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.780894 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.883617 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.883699 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.883725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.883771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.883808 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.989195 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.989321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.989356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.989401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:39 crc kubenswrapper[4745]: I1209 11:32:39.989448 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:39Z","lastTransitionTime":"2025-12-09T11:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.092430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.092491 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.092541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.092570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.092591 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.194848 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.194888 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.194899 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.194914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.194925 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.297420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.297486 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.297504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.297558 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.297575 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.400182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.400228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.400240 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.400256 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.400268 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.502654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.502723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.502741 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.502769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.502791 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.554598 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.554646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:40 crc kubenswrapper[4745]: E1209 11:32:40.554783 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.554646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:40 crc kubenswrapper[4745]: E1209 11:32:40.555022 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:40 crc kubenswrapper[4745]: E1209 11:32:40.555140 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.605748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.605798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.605813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.605834 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.605848 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.709442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.709565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.709600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.709624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.709640 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.811809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.811857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.811868 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.811886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.811897 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.914810 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.914877 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.914900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.914931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:40 crc kubenswrapper[4745]: I1209 11:32:40.914954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:40Z","lastTransitionTime":"2025-12-09T11:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.017674 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.017721 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.017737 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.017760 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.017834 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.122378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.122488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.122551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.122591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.122609 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.225846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.225891 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.225900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.225917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.225929 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.328624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.328674 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.328683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.328696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.328704 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.430786 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.430846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.430863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.430885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.430903 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.533599 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.533733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.533756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.533881 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.533911 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.554997 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:41 crc kubenswrapper[4745]: E1209 11:32:41.555693 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.556206 4745 scope.go:117] "RemoveContainer" containerID="1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.635984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.636040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.636055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.636082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.636099 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.738258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.738311 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.738334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.738358 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.738372 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.833837 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/1.log" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.836617 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.841099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.841147 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.841157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.841177 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.841191 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.943793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.943819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.943827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.943840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:41 crc kubenswrapper[4745]: I1209 11:32:41.943849 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:41Z","lastTransitionTime":"2025-12-09T11:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.046032 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.046073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.046082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.046097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.046107 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.148168 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.148198 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.148206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.148218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.148226 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.250382 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.250716 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.250726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.250742 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.250754 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.352947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.352976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.352984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.352997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.353006 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.454992 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.455028 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.455036 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.455053 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.455061 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.554280 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.554335 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.554279 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:42 crc kubenswrapper[4745]: E1209 11:32:42.554442 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:42 crc kubenswrapper[4745]: E1209 11:32:42.554584 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:42 crc kubenswrapper[4745]: E1209 11:32:42.554663 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.557290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.557356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.557366 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.557379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.557390 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.660190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.660221 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.660229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.660242 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.660250 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.763529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.763561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.763570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.763582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.763593 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.839149 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.861258 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.865754 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.865785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.865793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.865807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.865816 4745 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.880230 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.894461 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.926171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.943890 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.961642 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.969215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.969261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.969271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.969288 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.969300 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:42Z","lastTransitionTime":"2025-12-09T11:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.980297 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:42 crc kubenswrapper[4745]: I1209 11:32:42.994771 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437
d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:42Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.007265 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.017867 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.028675 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc 
kubenswrapper[4745]: I1209 11:32:43.040294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.061261 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.072293 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.072353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.072365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.072380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.072391 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.074480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.092986 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.105042 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.118380 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.175000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.175196 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.175287 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.175379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.175459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.278084 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.278143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.278161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.278185 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.278203 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.318244 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:43 crc kubenswrapper[4745]: E1209 11:32:43.318470 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:43 crc kubenswrapper[4745]: E1209 11:32:43.318657 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:32:59.318624198 +0000 UTC m=+66.143825772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.380597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.380636 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.380650 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.380667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.380679 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.486790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.486822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.486830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.486843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.486852 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.553996 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:43 crc kubenswrapper[4745]: E1209 11:32:43.554212 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.572797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.585091 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.588650 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.588701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.588718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.588745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.588766 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.600383 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.608150 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.617313 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.629296 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.642194 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.653122 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.668880 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba
87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.679229 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690057 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690090 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690102 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.690767 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.703280 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.715127 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.725459 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.735539 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc 
kubenswrapper[4745]: I1209 11:32:43.754861 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.768380 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.780821 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.792534 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc 
kubenswrapper[4745]: I1209 11:32:43.792576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.792588 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.792607 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.792619 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.842832 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/2.log" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.843394 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/1.log" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.845970 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" exitCode=1 Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.846020 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" 
event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.846053 4745 scope.go:117] "RemoveContainer" containerID="1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.846671 4745 scope.go:117] "RemoveContainer" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" Dec 09 11:32:43 crc kubenswrapper[4745]: E1209 11:32:43.846850 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.859750 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.872776 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.892264 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba
87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.894554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.894584 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.894593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.894607 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.894615 4745 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.904603 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.915079 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.927643 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.938547 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.948796 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.960636 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc 
kubenswrapper[4745]: I1209 11:32:43.971230 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.983965 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997290 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:43Z","lastTransitionTime":"2025-12-09T11:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:43 crc kubenswrapper[4745]: I1209 11:32:43.997928 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:
32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:43Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.019246 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1730348d6204393040e91f78b75da249fae052917ece2b1bd7a21cc67bbd142c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI1209 11:32:27.492966 6163 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1209 11:32:27.493003 6163 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493036 6163 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493070 6163 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1209 11:32:27.493099 6163 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1209 11:32:27.493126 6163 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1209 11:32:27.493166 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:32:27.493260 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs 
for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"na
me\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 
11:32:44.028749 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.042129 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.057176 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.067548 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.099396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.099434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.099444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.099460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.099473 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.201569 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.201626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.201640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.201661 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.201678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.304768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.304849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.304864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.304883 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.304892 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.327324 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327344 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:33:16.327323599 +0000 UTC m=+83.152525123 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.327476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.327537 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.327561 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.327582 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327682 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327704 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327709 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327744 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327772 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327791 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327721 4745 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:33:16.32771207 +0000 UTC m=+83.152913594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327725 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327836 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:33:16.327812382 +0000 UTC m=+83.153014006 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327856 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327874 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:33:16.327862314 +0000 UTC m=+83.153063958 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.327914 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 11:33:16.327898255 +0000 UTC m=+83.153099779 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.407237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.407277 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.407286 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.407301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.407311 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.509935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.509972 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.509981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.509996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.510005 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.554424 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.554477 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.554543 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.554577 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.554699 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.554774 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.612335 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.612392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.612413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.612444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.612466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.714776 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.714825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.714850 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.714867 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.714878 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.817249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.817292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.817305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.817323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.817333 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.851287 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/2.log" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.855083 4745 scope.go:117] "RemoveContainer" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" Dec 09 11:32:44 crc kubenswrapper[4745]: E1209 11:32:44.855222 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.872296 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.888206 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.905543 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.919954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.920005 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.920022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:44 crc 
kubenswrapper[4745]: I1209 11:32:44.920045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.920062 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:44Z","lastTransitionTime":"2025-12-09T11:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.927492 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.947680 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.966603 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.982966 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:44 crc kubenswrapper[4745]: I1209 11:32:44.996405 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:44Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc 
kubenswrapper[4745]: I1209 11:32:45.008464 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.022444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.022493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.022522 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.022540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.022550 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.032208 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.044582 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.064812 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.074823 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.092296 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.105419 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.117421 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.124883 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.124929 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.124945 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.124967 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.124984 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.128927 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:45Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.227257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.227310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.227320 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.227336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.227347 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.370996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.371067 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.371088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.371119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.371138 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.473180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.473249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.473266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.473291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.473306 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.554183 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:45 crc kubenswrapper[4745]: E1209 11:32:45.554363 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.576529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.576572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.576580 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.576591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.576600 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.679400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.679468 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.679486 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.679534 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.679553 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.781988 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.782050 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.782061 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.782077 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.782089 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.884479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.884551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.884562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.884579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.884593 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.986671 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.986704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.986715 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.986731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:45 crc kubenswrapper[4745]: I1209 11:32:45.986742 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:45Z","lastTransitionTime":"2025-12-09T11:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.089548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.089595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.089607 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.089624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.089636 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.191642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.191680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.191691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.191705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.191716 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.279656 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.288879 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.290958 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.293719 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.293744 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.293753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.293767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.293781 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.304440 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.313954 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.335728 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.346745 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.363377 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.378125 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395069 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395754 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.395771 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.404298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.414042 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.426466 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.437342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.448214 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.458408 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.469502 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.479602 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.488345 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:46Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:46 crc 
kubenswrapper[4745]: I1209 11:32:46.497847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.497884 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.497896 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.497912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.497923 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.554243 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.554273 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:46 crc kubenswrapper[4745]: E1209 11:32:46.554356 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.554243 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:46 crc kubenswrapper[4745]: E1209 11:32:46.554424 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:46 crc kubenswrapper[4745]: E1209 11:32:46.554679 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.600432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.600466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.600477 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.600492 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.600503 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.702407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.702440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.702450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.702465 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.702475 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.804587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.804637 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.804650 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.804667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.804678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.906104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.906133 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.906143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.906158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:46 crc kubenswrapper[4745]: I1209 11:32:46.906168 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:46Z","lastTransitionTime":"2025-12-09T11:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.008292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.008404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.008423 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.008442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.008455 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.110615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.110679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.110692 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.110711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.110722 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.213121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.213157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.213167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.213181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.213190 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.273699 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.273759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.273774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.273794 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.273804 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.311895 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:47Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.315841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.315895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.315906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.315927 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.315955 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.330477 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:47Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.337305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.337337 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.337350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.337364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.337373 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.355596 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:47Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.359349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.359380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.359389 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.359404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.359416 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.371278 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:47Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.374729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.374758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.374786 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.374798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.374809 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.387226 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:47Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.387336 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.388473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.388497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.388524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.388540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.388551 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.490255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.490291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.490302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.490317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.490329 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.554405 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:47 crc kubenswrapper[4745]: E1209 11:32:47.554561 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.592451 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.592492 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.592543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.592573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.592592 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.699664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.699725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.699748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.699776 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.699798 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.803091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.803155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.803171 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.803199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.803215 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.906888 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.906949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.906965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.906988 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:47 crc kubenswrapper[4745]: I1209 11:32:47.907007 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:47Z","lastTransitionTime":"2025-12-09T11:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.010447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.010500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.010552 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.010578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.010593 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.113912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.113967 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.113979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.113999 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.114015 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.218724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.218798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.218815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.218843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.218865 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.321930 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.322001 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.322020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.322078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.322108 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.424746 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.424798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.424809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.424823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.424833 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.528077 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.528123 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.528137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.528157 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.528172 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.554812 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:48 crc kubenswrapper[4745]: E1209 11:32:48.554955 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.555119 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:48 crc kubenswrapper[4745]: E1209 11:32:48.555162 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.555257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:48 crc kubenswrapper[4745]: E1209 11:32:48.555296 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.630843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.630898 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.630912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.630932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.630946 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.733472 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.733551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.733563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.733582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.733591 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.837154 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.837241 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.837259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.837289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.837308 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.940878 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.940955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.940971 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.940992 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:48 crc kubenswrapper[4745]: I1209 11:32:48.941006 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:48Z","lastTransitionTime":"2025-12-09T11:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.044854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.044911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.044921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.044943 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.044956 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.148947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.149036 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.149062 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.149103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.149127 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.251873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.251924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.251937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.251959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.251975 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.355253 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.355330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.355353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.355384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.355407 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.458279 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.458350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.458370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.458399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.458419 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.555751 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:49 crc kubenswrapper[4745]: E1209 11:32:49.555913 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.560166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.560194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.560205 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.560229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.560241 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.663108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.663155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.663167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.663187 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.663199 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.766803 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.766876 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.766895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.766922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.766941 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.870657 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.870719 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.870735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.870757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.870770 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.974666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.974750 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.974767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.974800 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:49 crc kubenswrapper[4745]: I1209 11:32:49.974819 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:49Z","lastTransitionTime":"2025-12-09T11:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.078359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.078495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.078557 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.078597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.078627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.182775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.182845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.182863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.182892 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.182917 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.286104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.286153 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.286162 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.286180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.286191 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.389504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.389655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.389680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.389715 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.389740 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.494166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.494232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.494249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.494276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.494295 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.554216 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.554250 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.554284 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:50 crc kubenswrapper[4745]: E1209 11:32:50.554544 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:50 crc kubenswrapper[4745]: E1209 11:32:50.554627 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:50 crc kubenswrapper[4745]: E1209 11:32:50.554783 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.598348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.598426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.598443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.598474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.598493 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.701878 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.701956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.701989 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.702019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.702040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.805385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.805468 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.805490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.805577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.805605 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.909276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.909341 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.909355 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.909381 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:50 crc kubenswrapper[4745]: I1209 11:32:50.909397 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:50Z","lastTransitionTime":"2025-12-09T11:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.012833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.012960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.012994 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.013032 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.013058 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.116340 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.116392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.116403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.116420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.116433 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.221817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.221885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.221904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.221933 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.221953 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.324873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.324943 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.324963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.324990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.325008 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.428898 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.428953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.428965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.428988 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.429003 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.532028 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.532071 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.532081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.532100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.532110 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.553969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:51 crc kubenswrapper[4745]: E1209 11:32:51.554099 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.634843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.634876 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.634885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.634899 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.634910 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.738765 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.738823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.738841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.738916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.738935 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.842327 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.842383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.842400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.842424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.842442 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.944681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.944777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.944807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.944842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:51 crc kubenswrapper[4745]: I1209 11:32:51.944867 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:51Z","lastTransitionTime":"2025-12-09T11:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.047744 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.047946 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.048005 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.048093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.048156 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.150227 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.150445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.150530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.150714 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.150783 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.253598 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.254089 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.254233 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.254350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.254607 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.357316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.357353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.357364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.357380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.357392 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.461076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.461131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.461141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.461158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.461169 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.553970 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.553970 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:52 crc kubenswrapper[4745]: E1209 11:32:52.554893 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.554123 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:52 crc kubenswrapper[4745]: E1209 11:32:52.554983 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:52 crc kubenswrapper[4745]: E1209 11:32:52.555150 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.564478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.564545 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.564556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.564571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.564584 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.666585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.666626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.666643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.666664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.666680 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.769258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.769377 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.769401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.769432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.769453 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.873112 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.873147 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.873159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.873177 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.873187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.977092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.977141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.977150 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.977167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:52 crc kubenswrapper[4745]: I1209 11:32:52.977178 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:52Z","lastTransitionTime":"2025-12-09T11:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.080137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.080229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.080252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.080280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.080484 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.184823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.184892 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.184911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.184940 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.184959 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.288154 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.288208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.288221 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.288244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.288259 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.391736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.391788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.391799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.391821 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.392264 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.496079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.496124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.496136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.496152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.496163 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.554991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:53 crc kubenswrapper[4745]: E1209 11:32:53.555292 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.574441 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.591631 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.598955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.599026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.599041 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.599061 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.599075 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.608011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57
059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.629653 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.647199 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.664444 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6f
d205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.680646 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc 
kubenswrapper[4745]: I1209 11:32:53.698684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.704296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.704584 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.704668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.704703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.704721 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.723238 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.735365 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.760007 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.770667 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.783788 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.797741 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.807474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.807554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.807572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.807597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.807610 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.812851 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3
a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.828863 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.843103 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.861285 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba
87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:53Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.911339 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.911387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.911396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.911411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:53 crc kubenswrapper[4745]: I1209 11:32:53.911422 4745 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:53Z","lastTransitionTime":"2025-12-09T11:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.014131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.014204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.014215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.014255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.014268 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.116223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.116301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.116316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.116335 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.116347 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.219357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.219388 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.219397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.219413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.219423 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.322300 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.322328 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.322336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.322348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.322356 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.425275 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.425337 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.425351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.425371 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.425387 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.527920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.527965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.527977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.527993 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.528003 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.554667 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.554723 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:54 crc kubenswrapper[4745]: E1209 11:32:54.554795 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.554819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:54 crc kubenswrapper[4745]: E1209 11:32:54.554949 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:54 crc kubenswrapper[4745]: E1209 11:32:54.555003 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.630138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.630171 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.630179 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.630194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.630202 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.732002 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.732056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.732071 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.732309 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.732339 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.834850 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.834879 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.834887 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.834900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.834909 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.936874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.936932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.936944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.936964 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:54 crc kubenswrapper[4745]: I1209 11:32:54.936980 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:54Z","lastTransitionTime":"2025-12-09T11:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.039902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.039953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.039964 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.039980 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.039994 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.142660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.142692 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.142702 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.142721 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.142734 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.245924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.245960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.245971 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.245986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.245999 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.348373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.348405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.348413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.348425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.348433 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.450993 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.451034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.451044 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.451060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.451075 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.553732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.553774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.553788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.553807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.553821 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.554650 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:55 crc kubenswrapper[4745]: E1209 11:32:55.554802 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.656177 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.656204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.656218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.656231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.656241 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.758686 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.758719 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.758729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.758744 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.758754 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.861416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.861480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.861500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.861542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.861560 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.964289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.964336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.964352 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.964373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:55 crc kubenswrapper[4745]: I1209 11:32:55.964389 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:55Z","lastTransitionTime":"2025-12-09T11:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.067176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.067218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.067229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.067249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.067261 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.169621 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.169660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.169672 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.169688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.169701 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.272174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.272202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.272209 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.272222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.272231 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.374909 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.374948 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.374958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.374976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.374987 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.477475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.477530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.477541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.477556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.477566 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.553773 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.553851 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.553902 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:56 crc kubenswrapper[4745]: E1209 11:32:56.553989 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:56 crc kubenswrapper[4745]: E1209 11:32:56.554177 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:56 crc kubenswrapper[4745]: E1209 11:32:56.554286 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.555169 4745 scope.go:117] "RemoveContainer" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" Dec 09 11:32:56 crc kubenswrapper[4745]: E1209 11:32:56.555386 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.579753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.579806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.579820 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.579840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.579857 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.682055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.682099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.682114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.682130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.682143 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.784247 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.784286 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.784294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.784308 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.784317 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.886634 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.886678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.886689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.886705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.886716 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.989747 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.989795 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.989806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.989824 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:56 crc kubenswrapper[4745]: I1209 11:32:56.989835 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:56Z","lastTransitionTime":"2025-12-09T11:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.092342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.092399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.092415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.092430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.092442 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.194778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.194846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.194866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.194891 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.194908 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.297011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.297059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.297072 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.297086 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.297095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.399889 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.399939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.399956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.399977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.399996 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.502426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.502467 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.502479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.502494 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.502523 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.554416 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.554584 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.605040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.605086 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.605109 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.605128 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.605141 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.707201 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.707251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.707261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.707282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.707298 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.716471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.716496 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.716516 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.716530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.716540 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.729212 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:57Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.733419 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.733465 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.733478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.733495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.733527 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.747926 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:57Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.751525 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.751555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.751566 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.751583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.751592 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.763837 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:57Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.766911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.766953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.766965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.766984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.766995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.780155 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:57Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.783731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.783768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.783779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.783793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.783803 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.795467 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:32:57Z is after 2025-08-24T17:21:41Z" Dec 09 11:32:57 crc kubenswrapper[4745]: E1209 11:32:57.795598 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.809460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.809493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.809521 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.809538 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.809549 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.936064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.936101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.936111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.936127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:57 crc kubenswrapper[4745]: I1209 11:32:57.936136 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:57Z","lastTransitionTime":"2025-12-09T11:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.037760 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.037841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.037855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.037873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.037883 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.140326 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.140370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.140386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.140409 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.140427 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.242613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.242661 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.242681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.242708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.242729 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.344808 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.345111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.345195 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.345262 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.345326 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.447864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.447902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.447913 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.447931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.447942 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.549660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.549922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.549995 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.550063 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.550142 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.554038 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.554128 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.554207 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:32:58 crc kubenswrapper[4745]: E1209 11:32:58.554306 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:32:58 crc kubenswrapper[4745]: E1209 11:32:58.554242 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:32:58 crc kubenswrapper[4745]: E1209 11:32:58.554497 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.651987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.652014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.652041 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.652055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.652063 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.754372 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.754415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.754427 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.754444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.754456 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.857006 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.857041 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.857052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.857069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.857080 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.958901 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.958938 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.958951 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.958966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:58 crc kubenswrapper[4745]: I1209 11:32:58.958976 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:58Z","lastTransitionTime":"2025-12-09T11:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.060910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.060946 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.060957 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.060974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.060986 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.163055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.163094 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.163104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.163121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.163137 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.265400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.265441 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.265453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.265468 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.265479 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.348806 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:59 crc kubenswrapper[4745]: E1209 11:32:59.348983 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:59 crc kubenswrapper[4745]: E1209 11:32:59.349184 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:33:31.349167606 +0000 UTC m=+98.174369130 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.368411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.368687 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.368777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.369028 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.369092 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.471632 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.471668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.471677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.471691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.471701 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.554553 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:32:59 crc kubenswrapper[4745]: E1209 11:32:59.554715 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.573277 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.573306 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.573314 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.573325 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.573334 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.675751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.675979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.675992 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.676009 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.676020 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.778319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.778361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.778374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.778390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.778401 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.881213 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.881246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.881257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.881273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.881283 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.983246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.983278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.983289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.983303 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:32:59 crc kubenswrapper[4745]: I1209 11:32:59.983314 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:32:59Z","lastTransitionTime":"2025-12-09T11:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.085087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.085118 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.085127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.085141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.085150 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.189248 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.189314 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.189334 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.189361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.189379 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.292368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.292436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.292457 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.292486 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.292505 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.394748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.394799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.394813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.394861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.394879 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.498199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.498230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.498238 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.498258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.498267 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.554824 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.554867 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.554944 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:00 crc kubenswrapper[4745]: E1209 11:33:00.554942 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:00 crc kubenswrapper[4745]: E1209 11:33:00.555034 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:00 crc kubenswrapper[4745]: E1209 11:33:00.555103 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.601070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.601117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.601133 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.601159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.601175 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.703447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.703493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.703529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.703549 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.703563 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.805362 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.805432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.805450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.805466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.805477 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.907539 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.907596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.907607 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.907622 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.907634 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:00Z","lastTransitionTime":"2025-12-09T11:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.950588 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/0.log" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.950631 4745 generic.go:334] "Generic (PLEG): container finished" podID="1002b34d-f671-4b20-bf4f-492ce3295cc4" containerID="c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f" exitCode=1 Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.950658 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerDied","Data":"c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f"} Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.950956 4745 scope.go:117] "RemoveContainer" containerID="c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.967447 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:00Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.980606 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:00Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:00 crc kubenswrapper[4745]: I1209 11:33:00.992939 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:00Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.008862 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.010152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.010176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.010191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.010206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.010216 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.022757 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e1377
6a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.035193 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.045666 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.056890 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc 
kubenswrapper[4745]: I1209 11:33:01.070746 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.083917 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.093569 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.111920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.111955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.111963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.111977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.111985 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.115889 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.125863 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.143103 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.156957 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.167281 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.178829 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.194204 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.213790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.214001 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.214015 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.214028 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.214037 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.316892 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.316942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.316959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.316982 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.317000 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.419744 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.419782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.419791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.419807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.419816 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.522248 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.522288 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.522299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.522318 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.522336 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.554810 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:01 crc kubenswrapper[4745]: E1209 11:33:01.554943 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.624958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.625269 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.625343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.625407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.625464 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.727214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.727443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.727528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.727602 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.727660 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.830743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.830785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.830796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.830813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.830822 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.933313 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.933364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.933380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.933397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.933410 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:01Z","lastTransitionTime":"2025-12-09T11:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.954272 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/0.log" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.954540 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerStarted","Data":"e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38"} Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.983859 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:01 crc kubenswrapper[4745]: I1209 11:33:01.997230 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:01Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.006917 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.018809 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.029993 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.036099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.036144 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.036152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.036165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.036174 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.046731 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.059564 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.070317 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.082228 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.095989 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.108594 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.120218 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.129548 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc 
kubenswrapper[4745]: I1209 11:33:02.138879 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.138916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.138933 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.138954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.138966 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.142482 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.161042 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.174173 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.195977 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.207284 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:02Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.241041 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.241169 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.241228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.241252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.241392 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.344091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.344158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.344180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.344212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.344241 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.447104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.447146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.447155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.447169 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.447178 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.549710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.549750 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.549764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.549783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.549796 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.553926 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.553954 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:02 crc kubenswrapper[4745]: E1209 11:33:02.554019 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.554043 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:02 crc kubenswrapper[4745]: E1209 11:33:02.554134 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:02 crc kubenswrapper[4745]: E1209 11:33:02.554315 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.652814 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.652854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.652865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.652882 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.652896 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.755781 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.755820 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.755830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.755845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.755853 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.858387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.858416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.858425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.858437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.858445 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.960149 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.960181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.960190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.960204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:02 crc kubenswrapper[4745]: I1209 11:33:02.960214 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:02Z","lastTransitionTime":"2025-12-09T11:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.062542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.062594 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.062611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.062627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.062637 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.165845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.165879 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.165888 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.165902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.165912 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.268667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.268703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.268713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.268725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.268734 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.371368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.371406 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.371414 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.371429 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.371440 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.473886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.473926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.473934 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.473953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.473966 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.554040 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:03 crc kubenswrapper[4745]: E1209 11:33:03.554185 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.565610 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.576420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.576458 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.576470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.576487 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.576498 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.579816 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.589613 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.606408 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.615946 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.633206 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.646086 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.657125 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.668584 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679213 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679238 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.679847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.691854 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.706595 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.719471 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.736676 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.749620 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.761998 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.771202 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.779786 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:03Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:03 crc 
kubenswrapper[4745]: I1209 11:33:03.781226 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.781254 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.781265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.781282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.781293 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.883182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.883232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.883244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.883262 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.883275 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.985939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.985981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.985993 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.986008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:03 crc kubenswrapper[4745]: I1209 11:33:03.986021 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:03Z","lastTransitionTime":"2025-12-09T11:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.088216 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.088252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.088261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.088273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.088282 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.190446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.190492 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.190503 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.190535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.190549 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.292274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.292301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.292309 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.292323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.292332 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.394638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.394676 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.394684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.394698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.394707 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.497021 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.497056 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.497067 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.497097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.497107 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.554800 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.554883 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:04 crc kubenswrapper[4745]: E1209 11:33:04.554957 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.554809 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:04 crc kubenswrapper[4745]: E1209 11:33:04.555031 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:04 crc kubenswrapper[4745]: E1209 11:33:04.555095 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.599787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.599830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.599842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.599858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.599869 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.701984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.702022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.702034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.702051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.702062 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.804076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.804123 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.804132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.804149 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.804159 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.906631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.906672 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.906681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.906697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:04 crc kubenswrapper[4745]: I1209 11:33:04.906709 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:04Z","lastTransitionTime":"2025-12-09T11:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.009244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.009476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.009560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.009637 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.009693 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.112237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.112272 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.112281 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.112293 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.112302 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.214324 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.214361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.214370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.214385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.214395 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.317842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.317876 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.317886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.317900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.317910 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.420289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.420349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.420361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.420376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.420388 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.522065 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.522105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.522115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.522132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.522142 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.554175 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:05 crc kubenswrapper[4745]: E1209 11:33:05.554305 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.624245 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.624281 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.624291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.624306 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.624316 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.726752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.726821 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.726843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.726869 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.726885 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.829403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.829446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.829462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.829480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.829491 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.933533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.933573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.933584 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.933600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:05 crc kubenswrapper[4745]: I1209 11:33:05.933611 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:05Z","lastTransitionTime":"2025-12-09T11:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.035412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.035460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.035490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.035542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.035577 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.137836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.137866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.137875 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.137889 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.137898 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.240704 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.240745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.240756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.240771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.240783 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.343258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.343298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.343309 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.343323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.343332 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.446173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.446218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.446237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.446258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.446271 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.548645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.548737 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.548752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.548767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.548777 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.554693 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:06 crc kubenswrapper[4745]: E1209 11:33:06.554898 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.554694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:06 crc kubenswrapper[4745]: E1209 11:33:06.555120 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.554693 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:06 crc kubenswrapper[4745]: E1209 11:33:06.555341 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.650765 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.650833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.650855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.650882 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.650902 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.753383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.753421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.753432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.753446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.753457 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.856207 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.856243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.856252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.856264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.856275 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.959589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.959630 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.959644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.959666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:06 crc kubenswrapper[4745]: I1209 11:33:06.959683 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:06Z","lastTransitionTime":"2025-12-09T11:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.062532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.062780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.062860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.062921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.062974 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.165614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.165884 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.166113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.166327 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.166502 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.269183 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.269540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.269656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.269747 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.269831 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.371982 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.372311 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.372408 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.372550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.372630 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.475181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.475459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.475571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.475650 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.475732 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.560194 4745 scope.go:117] "RemoveContainer" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.561048 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.561464 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.579242 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.579484 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.579660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.579804 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.579924 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.682051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.682347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.682434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.682527 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.682601 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.785574 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.785615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.785623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.785636 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.785647 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.825469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.825532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.825546 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.825564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.825577 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.838648 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:07Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.841592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.842117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.842291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.842376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.842456 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.854342 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:07Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.858091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.858145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.858158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.858176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.858187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.869709 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:07Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.875321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.875373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.875385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.875400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.875410 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.887680 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:07Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.891436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.891479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.891490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.891523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.891536 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.903269 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:07Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:07 crc kubenswrapper[4745]: E1209 11:33:07.903434 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.908683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.908725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.908739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.908760 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:07 crc kubenswrapper[4745]: I1209 11:33:07.908772 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:07Z","lastTransitionTime":"2025-12-09T11:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.010759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.010788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.010800 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.010815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.010825 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.114109 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.114181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.114204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.114236 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.114259 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.216899 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.216958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.216968 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.216983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.216995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.320192 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.320250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.320269 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.320292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.320309 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.423106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.423175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.423199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.423230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.423252 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.526225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.526263 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.526277 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.526297 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.526313 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.554755 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.554819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.554914 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:08 crc kubenswrapper[4745]: E1209 11:33:08.555062 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:08 crc kubenswrapper[4745]: E1209 11:33:08.555139 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:08 crc kubenswrapper[4745]: E1209 11:33:08.555221 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.628262 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.628295 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.628303 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.628315 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.628346 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.730504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.730556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.730566 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.730577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.730586 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.833060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.833098 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.833108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.833125 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.833136 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.935588 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.935653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.935664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.935680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.935689 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:08Z","lastTransitionTime":"2025-12-09T11:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.974706 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/2.log" Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.977608 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} Dec 09 11:33:08 crc kubenswrapper[4745]: I1209 11:33:08.978123 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.010093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9
df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.022048 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.032006 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.037561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.037589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.037597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.037609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.037617 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.045877 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.056954 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.066928 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.077224 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.087826 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.097454 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.107851 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.118151 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.128067 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.137157 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc 
kubenswrapper[4745]: I1209 11:33:09.139492 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.139548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.139559 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.139578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.139590 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.148855 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.162310 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.171698 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.191811 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.203795 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.241725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.241757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.241766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.241781 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.241790 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.343648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.343677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.343685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.343696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.343703 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.445556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.445595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.445605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.445618 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.445627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.548591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.548628 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.548644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.548665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.548682 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.554326 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:09 crc kubenswrapper[4745]: E1209 11:33:09.554468 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.650236 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.650271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.650279 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.650294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.650304 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.752979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.753012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.753022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.753037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.753050 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.855937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.855970 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.855978 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.855994 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.856003 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.958299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.958336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.958348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.958365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.958376 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:09Z","lastTransitionTime":"2025-12-09T11:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.981409 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.981947 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/2.log" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.984212 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" exitCode=1 Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.984254 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.984286 4745 scope.go:117] "RemoveContainer" containerID="bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.984781 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:33:09 crc kubenswrapper[4745]: E1209 11:33:09.984921 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:33:09 crc kubenswrapper[4745]: I1209 11:33:09.996846 4745 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:09Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.008904 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.020284 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.032803 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.045788 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.056671 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.060663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.060701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.060713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.060730 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.060741 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.068098 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.077716 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc 
kubenswrapper[4745]: I1209 11:33:10.089140 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.102536 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.112656 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.129797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9246f73d412e90b3c219905d757f729907805b0893747b64b37613f87e1b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:32:43Z\\\",\\\"message\\\":\\\"s for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 11:32:42.431250 6367 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431256 6367 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1209 11:32:42.431073 6367 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1209 11:32:42.431295 6367 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:09Z\\\",\\\"message\\\":\\\"33:09.217592 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217583 6773 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vwrlh\\\\nI1209 11:33:09.217601 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217595 6773 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jdv4j before timer (time: 2025-12-09 11:33:10.214550815 +0000 UTC m=+1.505266697): skip\\\\nI1209 11:33:09.217609 6773 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1209 11:33:09.217617 6773 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1209 11:33:09.217624 6773 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217638 6773 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 166.225µs)\\\\nI1209 11:33:09.217586 6773 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:33:09.217723 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35
a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.139829 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.160346 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.162851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.162885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.162895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.162910 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.162925 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.173906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
9T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.184524 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.196271 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.206087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:10Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.264720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.264756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.264763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.264777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.264786 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.368935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.369001 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.369019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.369040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.369057 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.471440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.471490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.471527 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.471551 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.471567 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.554346 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.554354 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.554356 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:10 crc kubenswrapper[4745]: E1209 11:33:10.554701 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:10 crc kubenswrapper[4745]: E1209 11:33:10.554745 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:10 crc kubenswrapper[4745]: E1209 11:33:10.554494 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.574385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.574412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.574420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.574433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.574441 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.677785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.677815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.677826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.677839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.677849 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.780393 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.780449 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.780462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.780480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.780492 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.882303 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.882347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.882359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.882375 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.882387 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.985329 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.985374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.985385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.985434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.985468 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:10Z","lastTransitionTime":"2025-12-09T11:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.991631 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log" Dec 09 11:33:10 crc kubenswrapper[4745]: I1209 11:33:10.995234 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:33:10 crc kubenswrapper[4745]: E1209 11:33:10.995384 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.005197 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.018215 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.032574 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.042264 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.061450 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:09Z\\\",\\\"message\\\":\\\"33:09.217592 6773 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217583 6773 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vwrlh\\\\nI1209 11:33:09.217601 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217595 6773 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jdv4j before timer (time: 2025-12-09 11:33:10.214550815 +0000 UTC m=+1.505266697): skip\\\\nI1209 11:33:09.217609 6773 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1209 11:33:09.217617 6773 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1209 11:33:09.217624 6773 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217638 6773 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 166.225µs)\\\\nI1209 11:33:09.217586 6773 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:33:09.217723 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:33:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.072328 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.088438 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.088468 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.088476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.088489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.088497 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.090092 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.102406 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.115466 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.127894 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.138973 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.150424 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.160173 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.170087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc 
kubenswrapper[4745]: I1209 11:33:11.180023 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.190533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.190564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.190572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.190585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.190595 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.194239 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.206718 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.217964 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:11Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.292844 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.292886 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.292896 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.292912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.292923 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.394970 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.395024 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.395034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.395048 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.395058 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.497307 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.497342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.497353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.497367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.497378 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.554555 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:11 crc kubenswrapper[4745]: E1209 11:33:11.554693 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.599381 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.599426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.599437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.599451 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.599460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.701990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.702044 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.702053 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.702066 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.702073 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.805736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.805804 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.805827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.805855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.805877 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.907752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.907790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.907798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.907814 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:11 crc kubenswrapper[4745]: I1209 11:33:11.907824 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:11Z","lastTransitionTime":"2025-12-09T11:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.010055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.010108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.010124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.010145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.010160 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.112946 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.112995 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.113011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.113033 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.113052 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.215148 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.215182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.215190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.215203 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.215211 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.317332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.317369 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.317378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.317391 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.317399 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.419440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.419479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.419488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.419500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.419523 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.522876 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.522957 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.522977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.523008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.523030 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.554562 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.554627 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.554688 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:12 crc kubenswrapper[4745]: E1209 11:33:12.554775 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:12 crc kubenswrapper[4745]: E1209 11:33:12.554880 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:12 crc kubenswrapper[4745]: E1209 11:33:12.554951 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.627332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.627402 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.627430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.627464 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.627484 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.731134 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.731182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.731191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.731211 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.731221 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.834150 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.834212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.834236 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.834265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.834284 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.936860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.936909 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.936919 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.936935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:12 crc kubenswrapper[4745]: I1209 11:33:12.936947 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:12Z","lastTransitionTime":"2025-12-09T11:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.038621 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.038696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.038710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.038727 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.038739 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.141099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.141387 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.141395 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.141407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.141414 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.244017 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.244059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.244070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.244086 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.244098 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.346244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.346280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.346289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.346304 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.346312 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.448900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.449158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.449231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.449312 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.449382 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.551173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.551234 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.551250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.551275 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.551294 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.554593 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:13 crc kubenswrapper[4745]: E1209 11:33:13.554778 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.564314 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.569860 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.583925 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.595702 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.608322 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.630929 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343bb199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.644965 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.653705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.653833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.653923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.654005 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.654089 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.655711 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.664881 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc 
kubenswrapper[4745]: I1209 11:33:13.675502 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.687898 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.703859 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.720182 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:09Z\\\",\\\"message\\\":\\\"33:09.217592 6773 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217583 6773 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vwrlh\\\\nI1209 11:33:09.217601 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217595 6773 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jdv4j before timer (time: 2025-12-09 11:33:10.214550815 +0000 UTC m=+1.505266697): skip\\\\nI1209 11:33:09.217609 6773 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1209 11:33:09.217617 6773 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1209 11:33:09.217624 6773 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217638 6773 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 166.225µs)\\\\nI1209 11:33:09.217586 6773 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:33:09.217723 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:33:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.729758 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.747181 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11
:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.756383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.756430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.756441 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.756460 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.756472 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.761883 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
9T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.776070 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.788023 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.798579 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:13Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.858849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.858894 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.858906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.858923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.858935 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.961149 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.961397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.961461 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.961542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:13 crc kubenswrapper[4745]: I1209 11:33:13.961720 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:13Z","lastTransitionTime":"2025-12-09T11:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.063856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.063888 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.063898 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.063910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.063919 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.167184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.167228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.167237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.167251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.167260 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.269488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.269550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.269565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.269581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.269591 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.371901 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.371939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.371950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.371965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.371977 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.473870 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.473917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.473928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.473944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.473955 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.554138 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:14 crc kubenswrapper[4745]: E1209 11:33:14.554242 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.554139 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.554139 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:14 crc kubenswrapper[4745]: E1209 11:33:14.554302 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:14 crc kubenswrapper[4745]: E1209 11:33:14.554569 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.576529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.576562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.576572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.576586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.576598 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.679000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.679035 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.679042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.679058 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.679066 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.781176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.781216 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.781230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.781272 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.781294 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.883650 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.883693 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.883703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.883751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.883764 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.986279 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.986327 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.986341 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.986359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:14 crc kubenswrapper[4745]: I1209 11:33:14.986370 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:14Z","lastTransitionTime":"2025-12-09T11:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.089095 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.089147 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.089156 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.089169 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.089179 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.191711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.191771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.191787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.191810 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.191828 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.294884 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.295344 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.295500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.295738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.295867 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.398739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.398806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.398825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.398851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.398875 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.502656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.502702 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.502718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.502744 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.502762 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.554644 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:15 crc kubenswrapper[4745]: E1209 11:33:15.554788 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.605274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.605318 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.605332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.605351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.605366 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.707434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.707789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.707878 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.707984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.708092 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.810738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.810811 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.810831 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.810862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.810883 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.913981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.914270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.914350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.914434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:15 crc kubenswrapper[4745]: I1209 11:33:15.914548 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:15Z","lastTransitionTime":"2025-12-09T11:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.016553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.016879 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.016961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.017035 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.017090 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.118965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.119008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.119018 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.119032 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.119041 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.221208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.221250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.221263 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.221277 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.221286 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.323962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.323996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.324005 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.324020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.324030 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.334276 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.334401 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.334426 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334493 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.334464574 +0000 UTC m=+147.159666158 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334531 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334546 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334557 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.334572 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334595 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2025-12-09 11:34:20.334582837 +0000 UTC m=+147.159784361 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.334612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334674 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334700 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.33469461 +0000 UTC m=+147.159896134 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334702 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334721 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334732 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.334773 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.334766262 +0000 UTC m=+147.159967776 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.335057 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.335156 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.335146283 +0000 UTC m=+147.160347807 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.426695 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.426913 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.427028 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.427126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.427206 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.529536 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.529755 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.529877 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.529997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.530095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.554352 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.554360 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.554403 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.554701 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.554718 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:16 crc kubenswrapper[4745]: E1209 11:33:16.554942 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.635226 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.635863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.635907 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.635926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.635938 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.738915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.738953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.738963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.738976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.738984 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.841260 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.841296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.841310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.841324 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.841334 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.943024 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.943055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.943066 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.943081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:16 crc kubenswrapper[4745]: I1209 11:33:16.943094 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:16Z","lastTransitionTime":"2025-12-09T11:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.045655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.045700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.045712 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.045731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.045743 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.148172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.148203 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.148210 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.148225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.148233 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.250311 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.250347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.250355 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.250370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.250382 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.353181 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.353215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.353224 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.353237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.353248 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.456432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.456471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.456479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.456494 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.456503 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.554829 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:17 crc kubenswrapper[4745]: E1209 11:33:17.555246 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.559194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.559232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.559245 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.559260 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.559268 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.662208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.662535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.662544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.662558 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.662568 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.765465 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.765573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.765587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.765616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.765632 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.868113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.868170 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.868182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.868201 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.868213 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.970712 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.970755 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.970766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.970783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:17 crc kubenswrapper[4745]: I1209 11:33:17.970795 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:17Z","lastTransitionTime":"2025-12-09T11:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.072713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.072766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.072777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.072794 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.072808 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.175583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.175617 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.175625 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.175639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.175647 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.274774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.274841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.274863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.274893 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.274913 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.292703 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.297627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.297672 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.297684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.297702 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.297719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.309810 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.316589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.316670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.316689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.316717 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.316743 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.331908 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.335923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.335963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.335974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.335991 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.336001 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.352722 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.357065 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.357104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.357113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.357130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.357140 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.369621 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9cf46efc-ff1e-4018-a4c0-ef197e8adebf\\\",\\\"systemUUID\\\":\\\"96c45dda-f1d0-4fb2-b98c-edc8f5390e21\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:18Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.369847 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.371555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.371610 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.371629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.371653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.371671 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.474049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.474112 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.474135 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.474161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.474182 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.554049 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.554128 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.554187 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.554271 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.554057 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:18 crc kubenswrapper[4745]: E1209 11:33:18.554375 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.576902 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.576960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.576971 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.576987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.576997 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.679150 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.679215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.679223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.679236 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.679245 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.781248 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.781283 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.781291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.781304 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.781313 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.883105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.883151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.883164 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.883180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.883192 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.985891 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.985955 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.985967 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.985981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:18 crc kubenswrapper[4745]: I1209 11:33:18.985990 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:18Z","lastTransitionTime":"2025-12-09T11:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.088392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.088465 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.088497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.088569 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.088592 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.191851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.191904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.191924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.191947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.191989 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.294495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.294581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.294593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.294611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.294624 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.396798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.396857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.396871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.396892 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.396907 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.499296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.499336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.499373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.499386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.499396 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.554024 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:19 crc kubenswrapper[4745]: E1209 11:33:19.554280 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.601675 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.601741 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.601755 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.601779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.601793 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.704323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.704374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.704393 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.704414 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.704429 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.806695 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.806746 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.806759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.806778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.806792 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.909479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.909559 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.909571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.909591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:19 crc kubenswrapper[4745]: I1209 11:33:19.909604 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:19Z","lastTransitionTime":"2025-12-09T11:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.011926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.011968 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.011977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.011993 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.012005 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.113648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.113683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.113693 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.113707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.113719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.215935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.215979 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.215991 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.216006 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.216018 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.318127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.318165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.318175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.318189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.318202 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.420953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.420998 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.421009 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.421049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.421063 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.523173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.523240 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.523262 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.523332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.523354 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.554663 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.554694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.554662 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:20 crc kubenswrapper[4745]: E1209 11:33:20.554782 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:20 crc kubenswrapper[4745]: E1209 11:33:20.554866 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:20 crc kubenswrapper[4745]: E1209 11:33:20.554918 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.626326 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.626373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.626385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.626400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.626415 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.728404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.728446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.728454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.728467 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.728475 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.831127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.831164 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.831173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.831185 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.831194 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.933215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.933257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.933268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.933285 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:20 crc kubenswrapper[4745]: I1209 11:33:20.933297 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:20Z","lastTransitionTime":"2025-12-09T11:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.035814 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.035858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.035874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.035895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.035911 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.139814 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.139884 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.139906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.139938 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.139961 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.243115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.243167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.243184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.243206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.243222 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.345960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.346020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.346045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.346078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.346102 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.448577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.448621 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.448634 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.448651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.448663 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.551663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.551726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.551742 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.551771 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.551788 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.554959 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:21 crc kubenswrapper[4745]: E1209 11:33:21.556200 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.653989 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.654021 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.654029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.654042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.654050 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.756766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.756826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.756842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.756865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.756881 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.860020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.860064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.860072 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.860085 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.860095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.963058 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.963087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.963096 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.963108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:21 crc kubenswrapper[4745]: I1209 11:33:21.963116 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:21Z","lastTransitionTime":"2025-12-09T11:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.065596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.065663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.065681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.065703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.065719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.167860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.167911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.167923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.167941 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.167954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.270088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.270135 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.270146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.270159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.270167 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.372670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.372700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.372711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.372725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.372736 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.474559 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.474595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.474603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.474618 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.474627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.554835 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:22 crc kubenswrapper[4745]: E1209 11:33:22.555010 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.555351 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:22 crc kubenswrapper[4745]: E1209 11:33:22.555499 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.555739 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:22 crc kubenswrapper[4745]: E1209 11:33:22.555866 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.576956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.577021 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.577036 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.577052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.577064 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.680014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.680084 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.680100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.680122 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.680138 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.783003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.783051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.783060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.783075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.783084 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.885245 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.885288 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.885296 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.885345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.885354 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.987531 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.987565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.987576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.987589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:22 crc kubenswrapper[4745]: I1209 11:33:22.987600 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:22Z","lastTransitionTime":"2025-12-09T11:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.090278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.090324 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.090338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.090355 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.090367 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.193034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.193064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.193073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.193088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.193098 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.295774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.295836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.295847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.295864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.295882 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.398818 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.398854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.398864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.398877 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.398887 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.500906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.500949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.500960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.500974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.500983 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.554426 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:23 crc kubenswrapper[4745]: E1209 11:33:23.554597 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.567552 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.580535 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.595357 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dc9202-9b7e-4a17-a80f-db9338f17cd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba480fec57059de114fd08691c54bf08b36f5a6c4ba8547f3b7f584d210c659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad313723f1c245da1b0da10913c97ca1cea620e
750b3d1883c64de4cd304186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bc7sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.604310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.604379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.604400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc 
kubenswrapper[4745]: I1209 11:33:23.604431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.604454 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.612731 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55eda0c3-9e07-4911-a739-5cb7b52f8ff6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4063a7b7f8deee7c7899e5aa1896e8488f8bf92467369aaf18a6dda6f913e7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3047c6676802023e4411b4f2d2ccfd929edc59c00125b8ad91de589e2a8622dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3047c6676802023e4411b4f2d2ccfd929edc59c00125b8ad91de589e2a8622dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.627650 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b28d5a6b222b1fee60e2e3765b5b9df1b69dcd15963f5cd11c427c444647be3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8e13776a1343b
b199b5797d375fd112b153c98f90e51cb1437d6d966b81952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.639545 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r6gmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1002b34d-f671-4b20-bf4f-492ce3295cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:00Z\\\",\\\"message\\\":\\\"2025-12-09T11:32:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc\\\\n2025-12-09T11:32:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_211073af-0d9e-4546-9f6c-2d85b8bfb2bc to /host/opt/cni/bin/\\\\n2025-12-09T11:32:15Z [verbose] multus-daemon started\\\\n2025-12-09T11:32:15Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T11:33:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r6gmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.652820 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d694900-c0af-4c93-be86-8b9e4467d152\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce81b451ebf93d7399060678c34081ce80ae7fce5ce110ba35ff79c70d227e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5f42b6fd205fe2928f3207e77e115a4c043
df1b9f47985edc72802b3e3e0f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7v9zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cxwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.667724 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea6befdd-80ca-42c2-813f-62a5cdff9605\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9gjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jdv4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc 
kubenswrapper[4745]: I1209 11:33:23.682382 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114fee68-2acc-4945-b383-7977af176d00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e59d58402068bbee01f7396df192a07bd69d0554247115c60f7c2240c05b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09a2835e48af702feb09a0b23c5f3ba8ce254dc94b7f20af7ab9930bc28597f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39fcc19db06379545635cb78392ba64abf0087e6ef64456643a365dc4c08f4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.696978 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2nxln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a087421f-143c-4f67-b2a3-38d27e5805ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0621bb5222ba3c9b2b26ec57cbd9507a5079d0368f48c46daa2098d307e3e634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57907e0cb97df4108554b6bcafc742cab94d83744c6297bf208fbf4998ef5689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f5ddd36a8f53d9a5097548fd52de87d37686e25b4d11db703d6cc097d755e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3f3c19b4326753750c283fe37a5128fd16abc60bb72e5997c0c5e0d54e22747\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad1c
3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad1c3c9aec8450b96d95ae214f011082b197d2a34f0a000a7473f039a789c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e37b071122437334dae4deee5f3603b85ac855d0e0211c255a21a2cad42eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebe2d0de8d1e32c8509e9c72d6a1a29250e60feeed7da26f3fa7f5ad64b0273f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2nxln\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.707228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.707274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.707285 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.707301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.707314 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.708402 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2nrjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34be8f45-03bf-4f76-96d6-b1b8c66f41b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e9a30745209ad2f2e8c2fde926ca0cf418801f4aff0a98dd24226f686da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mbq6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2nrjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.728585 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac484d76-f5da-4880-868d-1e2e5289c025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T11:33:09Z\\\",\\\"message\\\":\\\"33:09.217592 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217583 6773 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vwrlh\\\\nI1209 11:33:09.217601 6773 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217595 6773 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jdv4j before timer (time: 2025-12-09 11:33:10.214550815 +0000 UTC m=+1.505266697): skip\\\\nI1209 11:33:09.217609 6773 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1209 11:33:09.217617 6773 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1209 11:33:09.217624 6773 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1209 11:33:09.217638 6773 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 166.225µs)\\\\nI1209 11:33:09.217586 6773 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1209 11:33:09.217723 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:33:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77b229685850f560ff
08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:32:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:32:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dwc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vwrlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.744617 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrwjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9f8d51-22d8-4f69-b81f-9ee4ae5bc8ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1e8982b92e4d984551a7c346c73041026b6c80f583968ed5b53c83c40ae569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5psll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:32:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrwjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.766247 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.786653 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0e95e9-72b9-4928-bb37-a6761c77500c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T11:32:12Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 11:32:07.104013 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 11:32:07.105460 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1963888301/tls.crt::/tmp/serving-cert-1963888301/tls.key\\\\\\\"\\\\nI1209 11:32:12.653584 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 11:32:12.655491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 11:32:12.655526 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 11:32:12.655557 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 11:32:12.655566 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 11:32:12.660254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 11:32:12.660323 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 11:32:12.660373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 11:32:12.660393 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 11:32:12.660413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 11:32:12.660433 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 11:32:12.660277 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1209 11:32:12.662196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046dd28eb789f92d6b0086f8ff8ac7ec4
390b4402d55a26634f059b580d6f4b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.813802 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39b6220f-fc17-4df5-9a6e-2a39055c1715\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96728d2c3cf7756ee0c56aef45fd0a2b64a6baea8e3a92cf8c615225db90a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://128879059f618fe71b194187d768c3a7471fed1c5f82d1c5155695c7f1eacd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d675827d644a54e0ef196be479043e4ed251387e00c279353dc82f3cfc379c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9bb008e6dc78de5d4312a3fd37fd4ef6bd6c4449be90e03bdb20204d0cf09297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.822770 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.822804 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.822814 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.822829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.822840 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.848606 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dba930ee2d80b54c31c0294ea75a265a55c400eab9a645dee16d147788d4f5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.859728 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc611c650997ea694827043e95281113d5c0745279c9cf26a5094f8ca6f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:32:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.878390 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8656c090-abd6-486f-880b-8903460f8f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:32:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T11:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a767718cc63851cef650e6e1015fcb5eb749215db83ed187253aa323ba78bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc223ae1109b5cb6f754814d927161af98b04abc363bb6bd4604ff7a88c507d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4daf8e338986f6ec26ec8312ccc0bd1d530a24ea319ac407b778a4338036755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877f880e933eea6c6ecf631b3eeaf6a28252cc0e0906c812240412f6cbbfd809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6d29f5f81e200dac143c33341c6103a781c83d71a1988bf22534ffabf104b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T11:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\
\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e550c0846f1df56a570e98c3e4dc8f672e31cdba87c78acaa9fdc4e457c0878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd99624a59d65a8a3b16624bb5f07ac90b6f062f4877a224914608bade112749\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9df6739b87d45a1e58f655190726dc1455686c187d37e01d5850744ca060911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T11:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T11:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T11:31:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T11:33:23Z is after 2025-08-24T17:21:41Z" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.925706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.925767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.925783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.925807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:23 crc kubenswrapper[4745]: I1209 11:33:23.925836 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:23Z","lastTransitionTime":"2025-12-09T11:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.028231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.028261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.028272 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.028288 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.028319 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.131142 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.131182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.131191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.131204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.131213 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.234321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.234392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.234426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.234443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.234460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.336667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.336707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.336716 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.336733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.336745 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.439197 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.439223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.439232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.439244 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.439252 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.541928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.541963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.541973 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.541987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.541997 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.554390 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:24 crc kubenswrapper[4745]: E1209 11:33:24.554489 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.554390 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.554396 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:24 crc kubenswrapper[4745]: E1209 11:33:24.554612 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:24 crc kubenswrapper[4745]: E1209 11:33:24.554677 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.645560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.645597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.645609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.645627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.645637 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.748363 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.748394 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.748401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.748413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.748421 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.850683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.850719 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.850728 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.850741 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.850752 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.953339 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.953393 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.953409 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.953432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:24 crc kubenswrapper[4745]: I1209 11:33:24.953449 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:24Z","lastTransitionTime":"2025-12-09T11:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.056796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.056851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.056864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.056881 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.056893 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.159351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.159386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.159403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.159418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.159427 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.261457 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.261524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.261538 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.261555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.261566 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.363486 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.363560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.363575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.363591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.363602 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.465615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.465666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.465679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.465698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.465711 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.554125 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:25 crc kubenswrapper[4745]: E1209 11:33:25.554607 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.554842 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:33:25 crc kubenswrapper[4745]: E1209 11:33:25.555004 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.567115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.567142 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.567153 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.567166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.567177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.668925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.668961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.668970 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.668982 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.668991 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.771443 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.771476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.771487 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.771502 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.771530 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.874132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.874170 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.874180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.874194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.874204 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.976536 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.976593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.976612 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.976636 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:25 crc kubenswrapper[4745]: I1209 11:33:25.976653 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:25Z","lastTransitionTime":"2025-12-09T11:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.079054 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.079103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.079114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.079130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.079143 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.181569 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.181653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.181668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.181685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.181698 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.283885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.283911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.283919 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.283932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.283940 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.386394 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.386445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.386457 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.386475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.386489 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.488906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.488950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.488961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.488977 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.488987 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.554161 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.554192 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.554165 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:26 crc kubenswrapper[4745]: E1209 11:33:26.554368 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:26 crc kubenswrapper[4745]: E1209 11:33:26.554470 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:26 crc kubenswrapper[4745]: E1209 11:33:26.554587 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.590767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.590811 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.590825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.590842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.590854 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.694843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.694904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.694956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.694981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.695002 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.798122 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.798182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.798195 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.798227 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.798249 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.901037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.901129 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.901149 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.901184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:26 crc kubenswrapper[4745]: I1209 11:33:26.901212 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:26Z","lastTransitionTime":"2025-12-09T11:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.004126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.004174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.004192 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.004217 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.004233 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.106638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.106682 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.106693 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.106708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.106719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.209485 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.209549 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.209560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.209575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.209586 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.312143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.312176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.312184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.312196 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.312205 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.414336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.414378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.414385 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.414399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.414409 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.516773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.516824 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.516834 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.516848 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.516860 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.553902 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:27 crc kubenswrapper[4745]: E1209 11:33:27.554039 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.619425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.619487 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.619501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.619542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.619554 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.722247 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.722289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.722302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.722359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.722371 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.824790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.824834 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.824845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.824860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.824872 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.927717 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.927815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.927839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.927867 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:27 crc kubenswrapper[4745]: I1209 11:33:27.927888 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:27Z","lastTransitionTime":"2025-12-09T11:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.030667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.030715 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.030723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.030737 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.030748 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.134104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.134189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.134214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.134251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.134276 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.237428 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.237473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.237485 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.237532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.237545 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.340565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.340655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.340674 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.340694 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.340708 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.443319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.443367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.443378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.443396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.443410 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.545807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.545847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.545858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.545871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.545881 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.554100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.554165 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:28 crc kubenswrapper[4745]: E1209 11:33:28.554204 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.554224 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:28 crc kubenswrapper[4745]: E1209 11:33:28.554425 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:28 crc kubenswrapper[4745]: E1209 11:33:28.554493 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.638202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.638239 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.638265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.638282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.638290 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T11:33:28Z","lastTransitionTime":"2025-12-09T11:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.680230 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m"] Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.681531 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.684681 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.684703 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.684778 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.684813 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.703894 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.703876984 podStartE2EDuration="1m15.703876984s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.703644117 +0000 UTC m=+95.528845661" watchObservedRunningTime="2025-12-09 11:33:28.703876984 +0000 UTC m=+95.529078508" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.715813 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.715795755 podStartE2EDuration="42.715795755s" podCreationTimestamp="2025-12-09 11:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.715551498 +0000 UTC m=+95.540753032" watchObservedRunningTime="2025-12-09 11:33:28.715795755 +0000 UTC 
m=+95.540997279" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.785356 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04680df1-dfd6-4d2f-939a-cbea98c4e30a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.785394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.785424 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.785641 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04680df1-dfd6-4d2f-939a-cbea98c4e30a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.785697 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/04680df1-dfd6-4d2f-939a-cbea98c4e30a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.800969 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.800947671 podStartE2EDuration="1m14.800947671s" podCreationTimestamp="2025-12-09 11:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.788843545 +0000 UTC m=+95.614045069" watchObservedRunningTime="2025-12-09 11:33:28.800947671 +0000 UTC m=+95.626149195" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.844328 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podStartSLOduration=75.844292545 podStartE2EDuration="1m15.844292545s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.833745782 +0000 UTC m=+95.658947306" watchObservedRunningTime="2025-12-09 11:33:28.844292545 +0000 UTC m=+95.669494109" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.844920 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.844912463 podStartE2EDuration="15.844912463s" podCreationTimestamp="2025-12-09 11:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.843825092 +0000 UTC m=+95.669026616" watchObservedRunningTime="2025-12-09 11:33:28.844912463 +0000 UTC 
m=+95.670114017" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.871373 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r6gmj" podStartSLOduration=75.871345727 podStartE2EDuration="1m15.871345727s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.870928195 +0000 UTC m=+95.696129719" watchObservedRunningTime="2025-12-09 11:33:28.871345727 +0000 UTC m=+95.696547251" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.883681 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cxwgd" podStartSLOduration=75.883661479 podStartE2EDuration="1m15.883661479s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.883084183 +0000 UTC m=+95.708285707" watchObservedRunningTime="2025-12-09 11:33:28.883661479 +0000 UTC m=+95.708863003" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886087 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04680df1-dfd6-4d2f-939a-cbea98c4e30a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886115 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886139 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886188 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04680df1-dfd6-4d2f-939a-cbea98c4e30a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886203 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04680df1-dfd6-4d2f-939a-cbea98c4e30a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886366 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.886413 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/04680df1-dfd6-4d2f-939a-cbea98c4e30a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.887017 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04680df1-dfd6-4d2f-939a-cbea98c4e30a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.897591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04680df1-dfd6-4d2f-939a-cbea98c4e30a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.904018 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04680df1-dfd6-4d2f-939a-cbea98c4e30a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m966m\" (UID: \"04680df1-dfd6-4d2f-939a-cbea98c4e30a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.933195 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2nxln" podStartSLOduration=75.933173005 podStartE2EDuration="1m15.933173005s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.933174735 +0000 UTC 
m=+95.758376259" watchObservedRunningTime="2025-12-09 11:33:28.933173005 +0000 UTC m=+95.758374529" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.933940 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.933933166 podStartE2EDuration="1m16.933933166s" podCreationTimestamp="2025-12-09 11:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.91606177 +0000 UTC m=+95.741263294" watchObservedRunningTime="2025-12-09 11:33:28.933933166 +0000 UTC m=+95.759134690" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.945110 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2nrjq" podStartSLOduration=75.945077876 podStartE2EDuration="1m15.945077876s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.943934904 +0000 UTC m=+95.769136418" watchObservedRunningTime="2025-12-09 11:33:28.945077876 +0000 UTC m=+95.770279410" Dec 09 11:33:28 crc kubenswrapper[4745]: I1209 11:33:28.983024 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zrwjr" podStartSLOduration=75.982999729 podStartE2EDuration="1m15.982999729s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:28.98266842 +0000 UTC m=+95.807869944" watchObservedRunningTime="2025-12-09 11:33:28.982999729 +0000 UTC m=+95.808201243" Dec 09 11:33:29 crc kubenswrapper[4745]: I1209 11:33:29.005552 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" Dec 09 11:33:29 crc kubenswrapper[4745]: W1209 11:33:29.022337 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04680df1_dfd6_4d2f_939a_cbea98c4e30a.slice/crio-d403859cad56e6b0d2de54ea28433bfda2a591771cbebea8f0933c9b2c4ad0fe WatchSource:0}: Error finding container d403859cad56e6b0d2de54ea28433bfda2a591771cbebea8f0933c9b2c4ad0fe: Status 404 returned error can't find the container with id d403859cad56e6b0d2de54ea28433bfda2a591771cbebea8f0933c9b2c4ad0fe Dec 09 11:33:29 crc kubenswrapper[4745]: I1209 11:33:29.048176 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" event={"ID":"04680df1-dfd6-4d2f-939a-cbea98c4e30a","Type":"ContainerStarted","Data":"d403859cad56e6b0d2de54ea28433bfda2a591771cbebea8f0933c9b2c4ad0fe"} Dec 09 11:33:29 crc kubenswrapper[4745]: I1209 11:33:29.554632 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:29 crc kubenswrapper[4745]: E1209 11:33:29.555036 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:30 crc kubenswrapper[4745]: I1209 11:33:30.052811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" event={"ID":"04680df1-dfd6-4d2f-939a-cbea98c4e30a","Type":"ContainerStarted","Data":"a7abc397c4a471b03021946831696b93daef642841abb6e5d9f5ef54ab1ebd71"} Dec 09 11:33:30 crc kubenswrapper[4745]: I1209 11:33:30.554276 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:30 crc kubenswrapper[4745]: I1209 11:33:30.554322 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:30 crc kubenswrapper[4745]: E1209 11:33:30.554429 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:30 crc kubenswrapper[4745]: I1209 11:33:30.554558 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:30 crc kubenswrapper[4745]: E1209 11:33:30.554785 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:30 crc kubenswrapper[4745]: E1209 11:33:30.554974 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:31 crc kubenswrapper[4745]: I1209 11:33:31.412216 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:31 crc kubenswrapper[4745]: E1209 11:33:31.412410 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:33:31 crc kubenswrapper[4745]: E1209 11:33:31.412502 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs podName:ea6befdd-80ca-42c2-813f-62a5cdff9605 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:35.412474601 +0000 UTC m=+162.237676125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs") pod "network-metrics-daemon-jdv4j" (UID: "ea6befdd-80ca-42c2-813f-62a5cdff9605") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 11:33:31 crc kubenswrapper[4745]: I1209 11:33:31.554039 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:31 crc kubenswrapper[4745]: E1209 11:33:31.554239 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:32 crc kubenswrapper[4745]: I1209 11:33:32.553925 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:32 crc kubenswrapper[4745]: I1209 11:33:32.553969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:32 crc kubenswrapper[4745]: I1209 11:33:32.553925 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:32 crc kubenswrapper[4745]: E1209 11:33:32.554067 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:32 crc kubenswrapper[4745]: E1209 11:33:32.554228 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:32 crc kubenswrapper[4745]: E1209 11:33:32.554351 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:33 crc kubenswrapper[4745]: I1209 11:33:33.555991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:33 crc kubenswrapper[4745]: E1209 11:33:33.556981 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:34 crc kubenswrapper[4745]: I1209 11:33:34.554175 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:34 crc kubenswrapper[4745]: I1209 11:33:34.554312 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:34 crc kubenswrapper[4745]: E1209 11:33:34.554337 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:34 crc kubenswrapper[4745]: I1209 11:33:34.554204 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:34 crc kubenswrapper[4745]: E1209 11:33:34.554586 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:34 crc kubenswrapper[4745]: E1209 11:33:34.554670 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:35 crc kubenswrapper[4745]: I1209 11:33:35.554406 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:35 crc kubenswrapper[4745]: E1209 11:33:35.554570 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:36 crc kubenswrapper[4745]: I1209 11:33:36.554254 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:36 crc kubenswrapper[4745]: I1209 11:33:36.554332 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:36 crc kubenswrapper[4745]: I1209 11:33:36.554386 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:36 crc kubenswrapper[4745]: E1209 11:33:36.554397 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:36 crc kubenswrapper[4745]: E1209 11:33:36.554764 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:36 crc kubenswrapper[4745]: E1209 11:33:36.554676 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:37 crc kubenswrapper[4745]: I1209 11:33:37.555841 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:37 crc kubenswrapper[4745]: E1209 11:33:37.556204 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:37 crc kubenswrapper[4745]: I1209 11:33:37.556692 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:33:37 crc kubenswrapper[4745]: E1209 11:33:37.557026 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:33:38 crc kubenswrapper[4745]: I1209 11:33:38.553831 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:38 crc kubenswrapper[4745]: I1209 11:33:38.553876 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:38 crc kubenswrapper[4745]: I1209 11:33:38.554073 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:38 crc kubenswrapper[4745]: E1209 11:33:38.554162 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:38 crc kubenswrapper[4745]: E1209 11:33:38.554324 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:38 crc kubenswrapper[4745]: E1209 11:33:38.554394 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:39 crc kubenswrapper[4745]: I1209 11:33:39.555016 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:39 crc kubenswrapper[4745]: E1209 11:33:39.555219 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:40 crc kubenswrapper[4745]: I1209 11:33:40.554722 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:40 crc kubenswrapper[4745]: I1209 11:33:40.554774 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:40 crc kubenswrapper[4745]: I1209 11:33:40.554863 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:40 crc kubenswrapper[4745]: E1209 11:33:40.554936 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:40 crc kubenswrapper[4745]: E1209 11:33:40.555014 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:40 crc kubenswrapper[4745]: E1209 11:33:40.555095 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:41 crc kubenswrapper[4745]: I1209 11:33:41.554861 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:41 crc kubenswrapper[4745]: E1209 11:33:41.555044 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:42 crc kubenswrapper[4745]: I1209 11:33:42.554440 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:42 crc kubenswrapper[4745]: I1209 11:33:42.554491 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:42 crc kubenswrapper[4745]: I1209 11:33:42.554446 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:42 crc kubenswrapper[4745]: E1209 11:33:42.554599 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:42 crc kubenswrapper[4745]: E1209 11:33:42.554733 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:42 crc kubenswrapper[4745]: E1209 11:33:42.554826 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:43 crc kubenswrapper[4745]: I1209 11:33:43.554306 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:43 crc kubenswrapper[4745]: E1209 11:33:43.555604 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:44 crc kubenswrapper[4745]: I1209 11:33:44.554555 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:44 crc kubenswrapper[4745]: I1209 11:33:44.554627 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:44 crc kubenswrapper[4745]: E1209 11:33:44.554705 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:44 crc kubenswrapper[4745]: I1209 11:33:44.554640 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:44 crc kubenswrapper[4745]: E1209 11:33:44.554784 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:44 crc kubenswrapper[4745]: E1209 11:33:44.554935 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:45 crc kubenswrapper[4745]: I1209 11:33:45.553865 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:45 crc kubenswrapper[4745]: E1209 11:33:45.553993 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:46 crc kubenswrapper[4745]: I1209 11:33:46.554339 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:46 crc kubenswrapper[4745]: I1209 11:33:46.554364 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:46 crc kubenswrapper[4745]: I1209 11:33:46.554364 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:46 crc kubenswrapper[4745]: E1209 11:33:46.554483 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:46 crc kubenswrapper[4745]: E1209 11:33:46.554560 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:46 crc kubenswrapper[4745]: E1209 11:33:46.554611 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.101387 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/1.log" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.101880 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/0.log" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.101919 4745 generic.go:334] "Generic (PLEG): container finished" podID="1002b34d-f671-4b20-bf4f-492ce3295cc4" containerID="e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38" exitCode=1 Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.101945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerDied","Data":"e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38"} Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.101973 4745 scope.go:117] "RemoveContainer" containerID="c205900e2ccdd575b6c623eba8dd39d82e3c4e38badb8785f27fabc7914dc55f" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.102392 4745 scope.go:117] "RemoveContainer" containerID="e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38" Dec 09 11:33:47 crc kubenswrapper[4745]: E1209 11:33:47.102556 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r6gmj_openshift-multus(1002b34d-f671-4b20-bf4f-492ce3295cc4)\"" pod="openshift-multus/multus-r6gmj" podUID="1002b34d-f671-4b20-bf4f-492ce3295cc4" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.117381 4745 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m966m" podStartSLOduration=94.117366488 podStartE2EDuration="1m34.117366488s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:33:30.070213306 +0000 UTC m=+96.895414830" watchObservedRunningTime="2025-12-09 11:33:47.117366488 +0000 UTC m=+113.942568012" Dec 09 11:33:47 crc kubenswrapper[4745]: I1209 11:33:47.554202 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:47 crc kubenswrapper[4745]: E1209 11:33:47.554338 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:48 crc kubenswrapper[4745]: I1209 11:33:48.105263 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/1.log" Dec 09 11:33:48 crc kubenswrapper[4745]: I1209 11:33:48.553824 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:48 crc kubenswrapper[4745]: I1209 11:33:48.553918 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:48 crc kubenswrapper[4745]: I1209 11:33:48.553836 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:48 crc kubenswrapper[4745]: E1209 11:33:48.553951 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:48 crc kubenswrapper[4745]: E1209 11:33:48.554047 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:48 crc kubenswrapper[4745]: E1209 11:33:48.554103 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:49 crc kubenswrapper[4745]: I1209 11:33:49.553839 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:49 crc kubenswrapper[4745]: E1209 11:33:49.553979 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:49 crc kubenswrapper[4745]: I1209 11:33:49.554562 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:33:49 crc kubenswrapper[4745]: E1209 11:33:49.554681 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vwrlh_openshift-ovn-kubernetes(ac484d76-f5da-4880-868d-1e2e5289c025)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" Dec 09 11:33:50 crc kubenswrapper[4745]: I1209 11:33:50.554152 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:50 crc kubenswrapper[4745]: I1209 11:33:50.554152 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:50 crc kubenswrapper[4745]: I1209 11:33:50.554324 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:50 crc kubenswrapper[4745]: E1209 11:33:50.554625 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:50 crc kubenswrapper[4745]: E1209 11:33:50.554770 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:50 crc kubenswrapper[4745]: E1209 11:33:50.554987 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:51 crc kubenswrapper[4745]: I1209 11:33:51.554941 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:51 crc kubenswrapper[4745]: E1209 11:33:51.555804 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:52 crc kubenswrapper[4745]: I1209 11:33:52.554369 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:52 crc kubenswrapper[4745]: I1209 11:33:52.554399 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:52 crc kubenswrapper[4745]: E1209 11:33:52.554499 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:52 crc kubenswrapper[4745]: I1209 11:33:52.554368 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:52 crc kubenswrapper[4745]: E1209 11:33:52.554824 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:52 crc kubenswrapper[4745]: E1209 11:33:52.554999 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:53 crc kubenswrapper[4745]: E1209 11:33:53.536660 4745 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 09 11:33:53 crc kubenswrapper[4745]: I1209 11:33:53.553883 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:53 crc kubenswrapper[4745]: E1209 11:33:53.554838 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:53 crc kubenswrapper[4745]: E1209 11:33:53.707224 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 11:33:54 crc kubenswrapper[4745]: I1209 11:33:54.554108 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:54 crc kubenswrapper[4745]: I1209 11:33:54.554136 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:54 crc kubenswrapper[4745]: I1209 11:33:54.554164 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:54 crc kubenswrapper[4745]: E1209 11:33:54.555172 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:54 crc kubenswrapper[4745]: E1209 11:33:54.555215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:54 crc kubenswrapper[4745]: E1209 11:33:54.555254 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:55 crc kubenswrapper[4745]: I1209 11:33:55.554120 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:55 crc kubenswrapper[4745]: E1209 11:33:55.554232 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:56 crc kubenswrapper[4745]: I1209 11:33:56.554106 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:56 crc kubenswrapper[4745]: E1209 11:33:56.554261 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:56 crc kubenswrapper[4745]: I1209 11:33:56.554476 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:56 crc kubenswrapper[4745]: E1209 11:33:56.554561 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:56 crc kubenswrapper[4745]: I1209 11:33:56.554661 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:56 crc kubenswrapper[4745]: E1209 11:33:56.554730 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:57 crc kubenswrapper[4745]: I1209 11:33:57.554821 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:57 crc kubenswrapper[4745]: E1209 11:33:57.554963 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:58 crc kubenswrapper[4745]: I1209 11:33:58.554522 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:33:58 crc kubenswrapper[4745]: I1209 11:33:58.554558 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:33:58 crc kubenswrapper[4745]: E1209 11:33:58.554670 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:33:58 crc kubenswrapper[4745]: I1209 11:33:58.554681 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:33:58 crc kubenswrapper[4745]: E1209 11:33:58.554783 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:33:58 crc kubenswrapper[4745]: E1209 11:33:58.554884 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:33:58 crc kubenswrapper[4745]: E1209 11:33:58.709175 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 11:33:59 crc kubenswrapper[4745]: I1209 11:33:59.554591 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:33:59 crc kubenswrapper[4745]: E1209 11:33:59.554717 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:33:59 crc kubenswrapper[4745]: I1209 11:33:59.555148 4745 scope.go:117] "RemoveContainer" containerID="e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38" Dec 09 11:34:00 crc kubenswrapper[4745]: I1209 11:34:00.143816 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/1.log" Dec 09 11:34:00 crc kubenswrapper[4745]: I1209 11:34:00.144126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerStarted","Data":"a5f2ca3ad6920f0955798ebdde6246cdc51cf6e11a8a1877ceec813979e4dd16"} Dec 09 11:34:00 crc kubenswrapper[4745]: I1209 11:34:00.554553 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:00 crc kubenswrapper[4745]: I1209 11:34:00.554602 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:00 crc kubenswrapper[4745]: E1209 11:34:00.554745 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:34:00 crc kubenswrapper[4745]: I1209 11:34:00.554775 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:00 crc kubenswrapper[4745]: E1209 11:34:00.554954 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:34:00 crc kubenswrapper[4745]: E1209 11:34:00.555088 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:34:01 crc kubenswrapper[4745]: I1209 11:34:01.554490 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:01 crc kubenswrapper[4745]: E1209 11:34:01.554730 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:34:02 crc kubenswrapper[4745]: I1209 11:34:02.554322 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:02 crc kubenswrapper[4745]: I1209 11:34:02.554346 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:02 crc kubenswrapper[4745]: E1209 11:34:02.554455 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:34:02 crc kubenswrapper[4745]: E1209 11:34:02.554592 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:34:02 crc kubenswrapper[4745]: I1209 11:34:02.555410 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:02 crc kubenswrapper[4745]: E1209 11:34:02.555735 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:34:03 crc kubenswrapper[4745]: I1209 11:34:03.553901 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:03 crc kubenswrapper[4745]: E1209 11:34:03.555917 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:34:03 crc kubenswrapper[4745]: E1209 11:34:03.710041 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 11:34:04 crc kubenswrapper[4745]: I1209 11:34:04.554576 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:04 crc kubenswrapper[4745]: I1209 11:34:04.554623 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:04 crc kubenswrapper[4745]: E1209 11:34:04.554716 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:34:04 crc kubenswrapper[4745]: I1209 11:34:04.554734 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:04 crc kubenswrapper[4745]: E1209 11:34:04.554949 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:34:04 crc kubenswrapper[4745]: E1209 11:34:04.555145 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:34:04 crc kubenswrapper[4745]: I1209 11:34:04.555640 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.168835 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log" Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.172171 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerStarted","Data":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.172620 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.258572 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podStartSLOduration=112.258551587 podStartE2EDuration="1m52.258551587s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:05.199707642 +0000 UTC m=+132.024909216" watchObservedRunningTime="2025-12-09 11:34:05.258551587 +0000 UTC m=+132.083753111" Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.258841 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jdv4j"] Dec 09 11:34:05 crc kubenswrapper[4745]: I1209 11:34:05.258950 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:05 crc kubenswrapper[4745]: E1209 11:34:05.259053 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:34:06 crc kubenswrapper[4745]: I1209 11:34:06.553902 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:06 crc kubenswrapper[4745]: E1209 11:34:06.554316 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:34:06 crc kubenswrapper[4745]: I1209 11:34:06.554017 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:06 crc kubenswrapper[4745]: E1209 11:34:06.554402 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:34:06 crc kubenswrapper[4745]: I1209 11:34:06.554057 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:06 crc kubenswrapper[4745]: E1209 11:34:06.554465 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:34:06 crc kubenswrapper[4745]: I1209 11:34:06.553963 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:06 crc kubenswrapper[4745]: E1209 11:34:06.554555 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:34:08 crc kubenswrapper[4745]: I1209 11:34:08.554074 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:08 crc kubenswrapper[4745]: I1209 11:34:08.554096 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:08 crc kubenswrapper[4745]: E1209 11:34:08.554209 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdv4j" podUID="ea6befdd-80ca-42c2-813f-62a5cdff9605" Dec 09 11:34:08 crc kubenswrapper[4745]: I1209 11:34:08.554357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:08 crc kubenswrapper[4745]: I1209 11:34:08.554364 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:08 crc kubenswrapper[4745]: E1209 11:34:08.554480 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 11:34:08 crc kubenswrapper[4745]: E1209 11:34:08.554838 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 11:34:08 crc kubenswrapper[4745]: E1209 11:34:08.554970 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.361031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.410484 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.411132 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.414481 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lwwlq"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.415254 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.415578 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pmsng"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.416083 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.418826 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.419068 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.419227 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.419340 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.419492 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.419978 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkn7h"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.430257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.431402 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.432068 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.433084 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.462164 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.462860 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.463374 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.467291 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.467765 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.467805 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468088 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468251 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468355 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468452 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468584 4745 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.468806 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.469697 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.469846 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.470255 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.470612 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.471036 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.471172 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.472421 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.472578 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.472725 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.472975 4745 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.473117 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.473239 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.473388 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.473579 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kjsvk"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.474326 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.475344 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.475746 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.475907 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.476007 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.476104 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 
11:34:09.476176 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.476420 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.476532 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.476568 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.477701 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.482541 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwmfn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.483135 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.483731 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.483154 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.484360 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.484827 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.484901 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.489079 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.489592 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.496890 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.500162 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.500432 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.500773 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.501259 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-7954f5f757-8r64w"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.501624 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.501679 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.501741 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.501891 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502064 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502166 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502350 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502180 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502137 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502541 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502605 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502792 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502919 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.502978 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2ws\" (UniqueName: \"kubernetes.io/projected/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-kube-api-access-wq2ws\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503004 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/33da92e8-30d5-47b4-9d6a-496d4d1d1306-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503050 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-encryption-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503073 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-serving-cert\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503093 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc 
kubenswrapper[4745]: I1209 11:34:09.503140 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-images\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503164 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503189 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: \"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503210 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-config\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config\") pod 
\"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503272 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwg9x\" (UniqueName: \"kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-client\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503356 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-serving-cert\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503395 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503413 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvmq\" (UniqueName: \"kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503433 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503454 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-image-import-ca\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503473 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit-dir\") pod \"apiserver-76f77b778f-lwwlq\" (UID: 
\"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503493 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdn2\" (UniqueName: \"kubernetes.io/projected/33da92e8-30d5-47b4-9d6a-496d4d1d1306-kube-api-access-svdn2\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503532 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503563 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcrg\" (UniqueName: \"kubernetes.io/projected/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-kube-api-access-cxcrg\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: \"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503584 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-node-pullsecrets\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503606 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdz2s\" (UniqueName: \"kubernetes.io/projected/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-kube-api-access-pdz2s\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503625 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-serving-ca\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.503647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.504919 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.505901 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.506010 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.506282 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.506723 4745 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.507279 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.507496 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.508467 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hk5ns"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.508969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.509384 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.509873 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.510472 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vxndp"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.510863 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.514077 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c87ls"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.524729 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.525046 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.532792 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.532925 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.533208 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.533743 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.533940 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.534212 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.534377 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.542235 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.542308 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.542348 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.542471 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.543253 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.544951 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.545171 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.550445 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.551535 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.552053 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.552198 4745 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.552459 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.552636 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.553039 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.553274 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.553441 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.554097 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.556980 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.557018 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.557220 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.557361 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559297 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559368 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559415 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559484 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559500 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559746 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.559786 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.560180 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.561157 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.561360 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.561487 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.562911 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.563105 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.563293 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.563402 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.563485 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564289 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564476 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564564 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564626 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564479 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564699 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564741 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.564804 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.566439 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.567859 4745 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.571051 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.571920 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.572151 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.572301 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.572691 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.572937 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.573280 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.573634 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.575218 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.575219 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.575320 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.575346 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.575362 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.576887 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.581806 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.581950 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.582440 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w5n22"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.582578 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.582811 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.582956 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.583108 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.583216 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.587197 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j258s"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.587297 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.589098 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.589343 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.589914 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.590089 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.590836 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mnbl6"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.591638 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.591748 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.594850 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.596676 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.600062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pmsng"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.602045 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.603496 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcrg\" (UniqueName: \"kubernetes.io/projected/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-kube-api-access-cxcrg\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: 
\"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605541 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605572 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52737279-7072-4bdb-9e9b-8e8cd58d44c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605592 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7071e0-9d7c-4b66-b404-01511502284c-signing-key\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605615 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-node-pullsecrets\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605636 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdz2s\" (UniqueName: 
\"kubernetes.io/projected/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-kube-api-access-pdz2s\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605656 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605678 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-serving-ca\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605700 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605718 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b47373e1-3530-4cfd-a139-02674d232523-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcfrg\" (UniqueName: \"kubernetes.io/projected/8a071b7a-930e-46f6-91d3-3aefcacf5eec-kube-api-access-hcfrg\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605790 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605812 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605834 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsqhp\" (UniqueName: \"kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " 
pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605858 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605883 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq2ws\" (UniqueName: \"kubernetes.io/projected/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-kube-api-access-wq2ws\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605907 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cbz\" (UniqueName: \"kubernetes.io/projected/ea7071e0-9d7c-4b66-b404-01511502284c-kube-api-access-w2cbz\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605931 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605952 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/33da92e8-30d5-47b4-9d6a-496d4d1d1306-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605974 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-encryption-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.605996 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-serving-cert\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606016 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-config\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606040 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z988s\" (UniqueName: \"kubernetes.io/projected/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-kube-api-access-z988s\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc 
kubenswrapper[4745]: I1209 11:34:09.606063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-config\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606084 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9r2d\" (UniqueName: \"kubernetes.io/projected/770be29f-5b58-4569-a4a8-0618adb2ed5c-kube-api-access-s9r2d\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606106 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-serving-cert\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606129 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606151 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606169 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606188 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tknt\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-kube-api-access-4tknt\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606208 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-client\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606228 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606275 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-config\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606299 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41a4525-7677-4db8-b040-d4c1edfcc9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606326 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-images\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfc45955-ae9e-460c-97df-1d23f960c862-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606386 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606413 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606437 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606464 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb738d6-4a1b-4790-9376-0f416784bf8e-serving-cert\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 
11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606527 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606606 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-node-pullsecrets\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.606676 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lwwlq"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.607298 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.607409 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-serving-ca\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.607554 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ftnn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.608350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-images\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.608380 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.608498 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.608640 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.609582 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.609823 4745 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkn7h"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610033 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610107 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkdt\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-kube-api-access-9pkdt\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610140 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610175 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: \"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610193 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-config\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwg9x\" (UniqueName: \"kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610230 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610249 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslc7\" (UniqueName: \"kubernetes.io/projected/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-kube-api-access-xslc7\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610265 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-config\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " 
pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610296 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a41a4525-7677-4db8-b040-d4c1edfcc9a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-client\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610341 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sl9j\" (UniqueName: \"kubernetes.io/projected/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-kube-api-access-9sl9j\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610362 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a41a4525-7677-4db8-b040-d4c1edfcc9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610398 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-policies\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610423 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610449 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1295af6d-c037-4fa4-adfb-1d43919d86ed-metrics-tls\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610473 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 
11:34:09.610497 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw54t\" (UniqueName: \"kubernetes.io/projected/d02382a6-7e0e-4274-bbe2-e713ac39756c-kube-api-access-dw54t\") pod \"downloads-7954f5f757-8r64w\" (UID: \"d02382a6-7e0e-4274-bbe2-e713ac39756c\") " pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610553 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610571 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79gtn\" (UniqueName: \"kubernetes.io/projected/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-kube-api-access-79gtn\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610587 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-service-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610604 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610620 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-encryption-config\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610634 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52737279-7072-4bdb-9e9b-8e8cd58d44c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610680 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-serving-cert\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610697 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww5jh\" (UniqueName: \"kubernetes.io/projected/1295af6d-c037-4fa4-adfb-1d43919d86ed-kube-api-access-ww5jh\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610714 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47373e1-3530-4cfd-a139-02674d232523-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610729 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b47373e1-3530-4cfd-a139-02674d232523-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610748 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610767 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea7071e0-9d7c-4b66-b404-01511502284c-signing-cabundle\") 
pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610789 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610804 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-serving-cert\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610820 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413895ff-f52a-401e-a1f4-67a2ee0bc52c-serving-cert\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610834 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610850 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610867 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610883 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvmq\" (UniqueName: \"kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610897 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jgcb\" (UniqueName: \"kubernetes.io/projected/413895ff-f52a-401e-a1f4-67a2ee0bc52c-kube-api-access-6jgcb\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " 
pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610933 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-image-import-ca\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610949 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit-dir\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610967 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfc45955-ae9e-460c-97df-1d23f960c862-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610982 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-trusted-ca\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.610996 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-srv-cert\") pod 
\"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611017 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611034 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a071b7a-930e-46f6-91d3-3aefcacf5eec-machine-approver-tls\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611052 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdn2\" (UniqueName: \"kubernetes.io/projected/33da92e8-30d5-47b4-9d6a-496d4d1d1306-kube-api-access-svdn2\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611069 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-dir\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611084 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611100 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmdh6\" (UniqueName: \"kubernetes.io/projected/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-kube-api-access-dmdh6\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611116 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611131 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rjh\" (UniqueName: \"kubernetes.io/projected/4eb738d6-4a1b-4790-9376-0f416784bf8e-kube-api-access-b4rjh\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-auth-proxy-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: 
\"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611170 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-client\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.611874 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.613037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-audit-dir\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.613235 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-encryption-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.613971 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-image-import-ca\") pod 
\"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.613978 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.614086 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.614466 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-config\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.615005 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-serving-cert\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.615280 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33da92e8-30d5-47b4-9d6a-496d4d1d1306-config\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.616607 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-etcd-client\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.616812 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-serving-cert\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.616835 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kjsvk"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.616863 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.616873 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.617315 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.617350 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwmfn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.617386 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: \"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.617840 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.618043 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.618432 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.619535 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.620545 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.622549 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.625022 4745 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-server-lttn7"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.625577 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.626205 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.630628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/33da92e8-30d5-47b4-9d6a-496d4d1d1306-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.634425 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.635181 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.641099 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.641965 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.649874 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.654736 4745 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.655556 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vxndp"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.658829 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c87ls"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.658877 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.658892 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hk5ns"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.658974 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.660195 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.661596 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8r64w"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.664404 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p4p6k"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.669963 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jv8nd"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670744 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670800 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670815 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670826 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670883 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670949 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ftnn"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.670998 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.672478 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.673647 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.673829 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j258s"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.675191 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p4p6k"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.677233 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jv8nd"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 
11:34:09.678714 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.679987 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w5n22"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.682097 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.683074 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r79qf"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.686314 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r79qf"] Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.686426 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.693852 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.712470 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslc7\" (UniqueName: \"kubernetes.io/projected/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-kube-api-access-xslc7\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: 
\"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713202 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-config\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713218 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713233 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a41a4525-7677-4db8-b040-d4c1edfcc9a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713271 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sl9j\" (UniqueName: \"kubernetes.io/projected/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-kube-api-access-9sl9j\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713285 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41a4525-7677-4db8-b040-d4c1edfcc9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713305 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713329 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-policies\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713346 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1295af6d-c037-4fa4-adfb-1d43919d86ed-metrics-tls\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713376 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw54t\" (UniqueName: \"kubernetes.io/projected/d02382a6-7e0e-4274-bbe2-e713ac39756c-kube-api-access-dw54t\") pod \"downloads-7954f5f757-8r64w\" (UID: \"d02382a6-7e0e-4274-bbe2-e713ac39756c\") " pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713407 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713422 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713439 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79gtn\" (UniqueName: \"kubernetes.io/projected/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-kube-api-access-79gtn\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: 
\"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-service-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713468 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713483 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-encryption-config\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52737279-7072-4bdb-9e9b-8e8cd58d44c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713548 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j68kj\" (UniqueName: 
\"kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713565 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww5jh\" (UniqueName: \"kubernetes.io/projected/1295af6d-c037-4fa4-adfb-1d43919d86ed-kube-api-access-ww5jh\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713581 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713599 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b47373e1-3530-4cfd-a139-02674d232523-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713614 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47373e1-3530-4cfd-a139-02674d232523-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc 
kubenswrapper[4745]: I1209 11:34:09.713630 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713645 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea7071e0-9d7c-4b66-b404-01511502284c-signing-cabundle\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713661 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-serving-cert\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713674 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413895ff-f52a-401e-a1f4-67a2ee0bc52c-serving-cert\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713687 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713735 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfc45955-ae9e-460c-97df-1d23f960c862-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713757 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-trusted-ca\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713779 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgcb\" (UniqueName: \"kubernetes.io/projected/413895ff-f52a-401e-a1f4-67a2ee0bc52c-kube-api-access-6jgcb\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713811 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a071b7a-930e-46f6-91d3-3aefcacf5eec-machine-approver-tls\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713825 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-srv-cert\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-dir\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713859 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 
11:34:09.713875 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmdh6\" (UniqueName: \"kubernetes.io/projected/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-kube-api-access-dmdh6\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713890 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713904 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rjh\" (UniqueName: \"kubernetes.io/projected/4eb738d6-4a1b-4790-9376-0f416784bf8e-kube-api-access-b4rjh\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713918 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-auth-proxy-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713933 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-client\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713947 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713975 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.713989 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52737279-7072-4bdb-9e9b-8e8cd58d44c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714003 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7071e0-9d7c-4b66-b404-01511502284c-signing-key\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714021 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714066 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714082 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfrg\" (UniqueName: \"kubernetes.io/projected/8a071b7a-930e-46f6-91d3-3aefcacf5eec-kube-api-access-hcfrg\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714098 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b47373e1-3530-4cfd-a139-02674d232523-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714113 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsqhp\" (UniqueName: \"kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714129 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714144 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714161 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714176 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714197 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cbz\" (UniqueName: \"kubernetes.io/projected/ea7071e0-9d7c-4b66-b404-01511502284c-kube-api-access-w2cbz\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714214 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714228 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714245 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-serving-cert\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 
crc kubenswrapper[4745]: I1209 11:34:09.714260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-config\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714277 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z988s\" (UniqueName: \"kubernetes.io/projected/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-kube-api-access-z988s\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714293 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-config\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9r2d\" (UniqueName: \"kubernetes.io/projected/770be29f-5b58-4569-a4a8-0618adb2ed5c-kube-api-access-s9r2d\") pod 
\"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714339 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714356 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714373 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714390 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tknt\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-kube-api-access-4tknt\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714405 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714421 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b2x2\" (UniqueName: \"kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714441 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-client\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714474 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-config\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714488 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41a4525-7677-4db8-b040-d4c1edfcc9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714519 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfc45955-ae9e-460c-97df-1d23f960c862-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714534 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714549 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714566 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714584 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714599 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb738d6-4a1b-4790-9376-0f416784bf8e-serving-cert\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714615 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714631 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkdt\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-kube-api-access-9pkdt\") pod 
\"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.714646 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.715665 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.715979 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-dir\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.716682 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.716994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-trusted-ca\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.717176 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-service-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.717360 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.717528 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.717700 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.717842 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-ca\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.718228 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea7071e0-9d7c-4b66-b404-01511502284c-signing-cabundle\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.718811 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-audit-policies\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.719143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a071b7a-930e-46f6-91d3-3aefcacf5eec-machine-approver-tls\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.719377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-etcd-client\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.719453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/413895ff-f52a-401e-a1f4-67a2ee0bc52c-config\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.720036 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-auth-proxy-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.720691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-config\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.720797 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.721297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.721369 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.721008 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.722318 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.723009 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a071b7a-930e-46f6-91d3-3aefcacf5eec-config\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.724833 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-config\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-serving-cert\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725426 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfc45955-ae9e-460c-97df-1d23f960c862-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725499 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-encryption-config\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725557 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfc45955-ae9e-460c-97df-1d23f960c862-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725570 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413895ff-f52a-401e-a1f4-67a2ee0bc52c-serving-cert\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725762 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725881 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7071e0-9d7c-4b66-b404-01511502284c-signing-key\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.725974 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726089 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb738d6-4a1b-4790-9376-0f416784bf8e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726110 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726097 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-etcd-client\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.726565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/770be29f-5b58-4569-a4a8-0618adb2ed5c-srv-cert\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.727063 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb738d6-4a1b-4790-9376-0f416784bf8e-serving-cert\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.727274 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-serving-cert\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.729372 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1295af6d-c037-4fa4-adfb-1d43919d86ed-metrics-tls\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.731671 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52737279-7072-4bdb-9e9b-8e8cd58d44c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.739208 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.748885 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52737279-7072-4bdb-9e9b-8e8cd58d44c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.753695 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.773362 4745 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.813352 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.815892 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.815941 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.815960 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816045 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816114 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816129 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816145 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b2x2\" (UniqueName: \"kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816168 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816195 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816232 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816321 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816343 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j68kj\" (UniqueName: \"kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.816388 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.833037 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.853111 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.873229 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.883818 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.893270 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.913855 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.922224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.933539 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.954111 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.959431 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.973912 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.982870 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b47373e1-3530-4cfd-a139-02674d232523-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 11:34:09.993521 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 11:34:09 crc kubenswrapper[4745]: I1209 
11:34:09.998289 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47373e1-3530-4cfd-a139-02674d232523-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.013274 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.033318 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.043210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-config\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.053943 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.073810 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.088029 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41a4525-7677-4db8-b040-d4c1edfcc9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: 
\"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.094018 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.114070 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.117379 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41a4525-7677-4db8-b040-d4c1edfcc9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.134922 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.154725 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.174575 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.193443 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.215099 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 11:34:10 crc 
kubenswrapper[4745]: I1209 11:34:10.233834 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.253025 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.274157 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.294087 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.314646 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.335132 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.354313 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.373909 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.393716 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.401637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.420224 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.428181 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.434627 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.454117 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.474616 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.494001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.514251 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.534526 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 11:34:10 crc 
kubenswrapper[4745]: I1209 11:34:10.554391 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.554431 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.554494 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.554398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.555046 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.563100 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.573821 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.593652 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.598723 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.612773 4745 request.go:700] Waited for 1.019553638s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-system-serving-cert&limit=500&resourceVersion=0 Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.614368 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.619956 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.634186 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.637964 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.659213 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.671790 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.673239 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.678719 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.693287 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.713491 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.719924 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc 
kubenswrapper[4745]: I1209 11:34:10.735520 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.753628 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.759894 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.784250 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.788872 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.793857 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.813803 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817102 4745 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync 
configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817126 4745 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817177 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle podName:3824312f-b03f-42da-890e-53a61841a8b0 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:11.31715474 +0000 UTC m=+138.142356264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-j258s" (UID: "3824312f-b03f-42da-890e-53a61841a8b0") : failed to sync configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817235 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig podName:3824312f-b03f-42da-890e-53a61841a8b0 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:11.317213702 +0000 UTC m=+138.142415226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-j258s" (UID: "3824312f-b03f-42da-890e-53a61841a8b0") : failed to sync configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817492 4745 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: E1209 11:34:10.817724 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies podName:3824312f-b03f-42da-890e-53a61841a8b0 nodeName:}" failed. No retries permitted until 2025-12-09 11:34:11.317670545 +0000 UTC m=+138.142872119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies") pod "oauth-openshift-558db77b4-j258s" (UID: "3824312f-b03f-42da-890e-53a61841a8b0") : failed to sync configmap cache: timed out waiting for the condition Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.840289 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.853963 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.873792 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.893434 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 11:34:10 
crc kubenswrapper[4745]: I1209 11:34:10.913914 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.933200 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.954340 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.973908 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 11:34:10 crc kubenswrapper[4745]: I1209 11:34:10.994010 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.014249 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.033803 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.054187 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.073901 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.093505 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.114029 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 11:34:11 
crc kubenswrapper[4745]: I1209 11:34:11.134669 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.167148 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcrg\" (UniqueName: \"kubernetes.io/projected/5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04-kube-api-access-cxcrg\") pod \"cluster-samples-operator-665b6dd947-8rbfb\" (UID: \"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.187447 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdz2s\" (UniqueName: \"kubernetes.io/projected/c500e9ec-eac6-45e4-bb6d-209e92ffbdad-kube-api-access-pdz2s\") pod \"apiserver-76f77b778f-lwwlq\" (UID: \"c500e9ec-eac6-45e4-bb6d-209e92ffbdad\") " pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.194054 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.226854 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2ws\" (UniqueName: \"kubernetes.io/projected/50fd29d0-8021-4cf0-ad83-dc5c679aeb43-kube-api-access-wq2ws\") pod \"openshift-config-operator-7777fb866f-pmsng\" (UID: \"50fd29d0-8021-4cf0-ad83-dc5c679aeb43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.233550 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.262824 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.266790 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwg9x\" (UniqueName: \"kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x\") pod \"route-controller-manager-6576b87f9c-wjx47\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.286110 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.290731 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdn2\" (UniqueName: \"kubernetes.io/projected/33da92e8-30d5-47b4-9d6a-496d4d1d1306-kube-api-access-svdn2\") pod \"machine-api-operator-5694c8668f-zkn7h\" (UID: \"33da92e8-30d5-47b4-9d6a-496d4d1d1306\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.301082 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.311859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvmq\" (UniqueName: \"kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq\") pod \"controller-manager-879f6c89f-hj9bn\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.314585 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.333378 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.335641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.335702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.335903 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.336758 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.337082 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.337847 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.354785 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.382600 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.390759 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.394448 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.413982 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.435089 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.453435 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb"] Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.455783 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.475267 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.482559 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lwwlq"] Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.493641 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.517632 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pmsng"] Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 
11:34:11.518552 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.533130 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.554008 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkn7h"] Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.554645 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.573845 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.593132 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.609673 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jgcb\" (UniqueName: \"kubernetes.io/projected/413895ff-f52a-401e-a1f4-67a2ee0bc52c-kube-api-access-6jgcb\") pod \"console-operator-58897d9998-kjsvk\" (UID: \"413895ff-f52a-401e-a1f4-67a2ee0bc52c\") " pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.610589 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.612978 4745 request.go:700] Waited for 1.896297949s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/serviceaccounts/kube-apiserver-operator/token Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.630422 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a41a4525-7677-4db8-b040-d4c1edfcc9a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6nj4\" (UID: \"a41a4525-7677-4db8-b040-d4c1edfcc9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.646583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsqhp\" (UniqueName: \"kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp\") pod \"console-f9d7485db-mcrk5\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.667261 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.685723 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b47373e1-3530-4cfd-a139-02674d232523-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8vlj\" (UID: \"b47373e1-3530-4cfd-a139-02674d232523\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.699204 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.706151 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw54t\" (UniqueName: \"kubernetes.io/projected/d02382a6-7e0e-4274-bbe2-e713ac39756c-kube-api-access-dw54t\") pod \"downloads-7954f5f757-8r64w\" (UID: \"d02382a6-7e0e-4274-bbe2-e713ac39756c\") " pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.752322 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rjh\" (UniqueName: \"kubernetes.io/projected/4eb738d6-4a1b-4790-9376-0f416784bf8e-kube-api-access-b4rjh\") pod \"authentication-operator-69f744f599-vwmfn\" (UID: \"4eb738d6-4a1b-4790-9376-0f416784bf8e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.771017 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sl9j\" (UniqueName: \"kubernetes.io/projected/e0e69154-b28d-4d09-9fd7-3e28b08d20cd-kube-api-access-9sl9j\") pod \"etcd-operator-b45778765-c87ls\" (UID: \"e0e69154-b28d-4d09-9fd7-3e28b08d20cd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.786180 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.786677 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcfrg\" (UniqueName: \"kubernetes.io/projected/8a071b7a-930e-46f6-91d3-3aefcacf5eec-kube-api-access-hcfrg\") pod \"machine-approver-56656f9798-b2kgm\" (UID: \"8a071b7a-930e-46f6-91d3-3aefcacf5eec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.796588 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.806403 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww5jh\" (UniqueName: \"kubernetes.io/projected/1295af6d-c037-4fa4-adfb-1d43919d86ed-kube-api-access-ww5jh\") pod \"dns-operator-744455d44c-vxndp\" (UID: \"1295af6d-c037-4fa4-adfb-1d43919d86ed\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.832847 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cbz\" (UniqueName: \"kubernetes.io/projected/ea7071e0-9d7c-4b66-b404-01511502284c-kube-api-access-w2cbz\") pod \"service-ca-9c57cc56f-hk5ns\" (UID: \"ea7071e0-9d7c-4b66-b404-01511502284c\") " pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.852285 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9r2d\" (UniqueName: \"kubernetes.io/projected/770be29f-5b58-4569-a4a8-0618adb2ed5c-kube-api-access-s9r2d\") pod \"olm-operator-6b444d44fb-65bct\" (UID: \"770be29f-5b58-4569-a4a8-0618adb2ed5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.863858 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.871969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.875358 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tknt\" (UniqueName: \"kubernetes.io/projected/52737279-7072-4bdb-9e9b-8e8cd58d44c9-kube-api-access-4tknt\") pod \"ingress-operator-5b745b69d9-snhrr\" (UID: \"52737279-7072-4bdb-9e9b-8e8cd58d44c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.881952 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.892642 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79gtn\" (UniqueName: \"kubernetes.io/projected/7b8d6488-91ee-4d61-a22d-377b3d2c8aab-kube-api-access-79gtn\") pod \"openshift-apiserver-operator-796bbdcf4f-p57l4\" (UID: \"7b8d6488-91ee-4d61-a22d-377b3d2c8aab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.897935 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.912570 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.921769 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69e84e12-ef0d-46b5-8d5b-fde7b040e11b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9hssr\" (UID: \"69e84e12-ef0d-46b5-8d5b-fde7b040e11b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.927119 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslc7\" (UniqueName: \"kubernetes.io/projected/5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6-kube-api-access-xslc7\") pod \"kube-storage-version-migrator-operator-b67b599dd-sxw5b\" (UID: \"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.954988 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkdt\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-kube-api-access-9pkdt\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.970161 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfc45955-ae9e-460c-97df-1d23f960c862-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvs6z\" (UID: \"cfc45955-ae9e-460c-97df-1d23f960c862\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:11 crc kubenswrapper[4745]: I1209 11:34:11.990747 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z988s\" (UniqueName: \"kubernetes.io/projected/38d93c0b-6422-4a07-98dd-5a9eacdc1fa3-kube-api-access-z988s\") pod \"openshift-controller-manager-operator-756b6f6bc6-47jpc\" (UID: \"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.017494 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.031006 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b2x2\" (UniqueName: \"kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2\") pod \"marketplace-operator-79b997595-w5n22\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") " pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.048562 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.050409 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j68kj\" (UniqueName: \"kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj\") pod \"oauth-openshift-558db77b4-j258s\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.053267 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.065138 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.074396 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.079759 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.093758 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.103289 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.109703 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.113421 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.119378 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.133741 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.156785 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.195261 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.195641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" event={"ID":"50fd29d0-8021-4cf0-ad83-dc5c679aeb43","Type":"ContainerStarted","Data":"5fa6e6cba7a66ea246e7fa6a533e407f7cde7ad40b7c8be0cca6a2ac04164d6a"} Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.196264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmdh6\" (UniqueName: \"kubernetes.io/projected/719b893a-e0dd-4a09-86ba-2c5b177ba8b6-kube-api-access-dmdh6\") pod \"apiserver-7bbb656c7d-r6r9g\" (UID: \"719b893a-e0dd-4a09-86ba-2c5b177ba8b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.198604 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" event={"ID":"33da92e8-30d5-47b4-9d6a-496d4d1d1306","Type":"ContainerStarted","Data":"5f8d60831f3cb4d6f690a5fc0d23ece5afb5b7c513248070c7725d4c9e969ee9"} Dec 09 11:34:12 crc kubenswrapper[4745]: W1209 11:34:12.198564 4745 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfa061d_49b8_4640_ae8d_674ef0832ef7.slice/crio-4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173 WatchSource:0}: Error finding container 4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173: Status 404 returned error can't find the container with id 4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173 Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.200263 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" event={"ID":"c500e9ec-eac6-45e4-bb6d-209e92ffbdad","Type":"ContainerStarted","Data":"c78b65caedcc7f13c38c8369fbf821348cbe32d7cf694bba266e735926ecb92c"} Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.206018 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252469 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cmw\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252574 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252667 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f4c688-bb4b-4268-b2f5-21739872a26d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252706 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-srv-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252722 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252752 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k48zg\" (UniqueName: \"kubernetes.io/projected/ba30ed9b-bec0-4977-ad91-91128d2d7636-kube-api-access-k48zg\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252790 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/62a21827-2dc4-47ee-88b8-af5721a23829-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252804 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frz7n\" (UniqueName: \"kubernetes.io/projected/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-kube-api-access-frz7n\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252820 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t882\" (UniqueName: \"kubernetes.io/projected/852830c1-9fd1-4c23-807a-fef5c5934c82-kube-api-access-9t882\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252871 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252947 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5vq\" (UniqueName: \"kubernetes.io/projected/28f4c688-bb4b-4268-b2f5-21739872a26d-kube-api-access-4z5vq\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 
11:34:12.252973 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-node-bootstrap-token\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.252988 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-stats-auth\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253030 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba30ed9b-bec0-4977-ad91-91128d2d7636-proxy-tls\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253065 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-config\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253129 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqzp\" (UniqueName: \"kubernetes.io/projected/4c700629-9c94-4e16-b584-e28f2103e68e-kube-api-access-zsqzp\") pod 
\"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253372 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppfc\" (UniqueName: \"kubernetes.io/projected/321f0b7a-1cf4-4814-81d3-fe25fd718555-kube-api-access-nppfc\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253436 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/852830c1-9fd1-4c23-807a-fef5c5934c82-service-ca-bundle\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253499 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253648 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c700629-9c94-4e16-b584-e28f2103e68e-proxy-tls\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253701 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697eb94-a732-4bb6-90c2-cd97e857b60b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253795 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253817 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krf6l\" (UniqueName: 
\"kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253882 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5jpq\" (UniqueName: \"kubernetes.io/projected/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-kube-api-access-g5jpq\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253899 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/321f0b7a-1cf4-4814-81d3-fe25fd718555-tmpfs\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.253979 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tws4w\" (UniqueName: \"kubernetes.io/projected/62a21827-2dc4-47ee-88b8-af5721a23829-kube-api-access-tws4w\") pod \"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl75f\" (UniqueName: \"kubernetes.io/projected/e697eb94-a732-4bb6-90c2-cd97e857b60b-kube-api-access-hl75f\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254198 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254295 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-certs\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254332 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-images\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254462 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-serving-cert\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254499 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgm6l\" (UniqueName: \"kubernetes.io/projected/5146ef22-be05-4272-bfa7-80452a0a908f-kube-api-access-dgm6l\") pod \"migrator-59844c95c7-nrkr5\" (UID: \"5146ef22-be05-4272-bfa7-80452a0a908f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzj4q\" (UniqueName: \"kubernetes.io/projected/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-kube-api-access-gzj4q\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254679 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-apiservice-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254762 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-webhook-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254813 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-metrics-certs\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254845 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba30ed9b-bec0-4977-ad91-91128d2d7636-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.254901 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-default-certificate\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.267632 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:12.767603761 +0000 UTC m=+139.592805365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.268246 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.284388 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.308724 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.355711 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.355981 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:12.855953966 +0000 UTC m=+139.681155490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.356387 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-serving-cert\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.356431 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgm6l\" (UniqueName: 
\"kubernetes.io/projected/5146ef22-be05-4272-bfa7-80452a0a908f-kube-api-access-dgm6l\") pod \"migrator-59844c95c7-nrkr5\" (UID: \"5146ef22-be05-4272-bfa7-80452a0a908f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.356461 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.356820 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:12.85681005 +0000 UTC m=+139.682011574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.356897 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzj4q\" (UniqueName: \"kubernetes.io/projected/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-kube-api-access-gzj4q\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.357044 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-apiservice-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.357094 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3675753d-4d26-4ec5-9abd-4e6f8966f56a-config-volume\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.357481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-webhook-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: 
\"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.358722 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.358967 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6brcw\" (UniqueName: \"kubernetes.io/projected/5fc24512-e828-4d9c-acde-b03f888e9474-kube-api-access-6brcw\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.359032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba30ed9b-bec0-4977-ad91-91128d2d7636-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.359107 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-metrics-certs\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.359376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-default-certificate\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360036 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-mountpoint-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360158 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cmw\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba30ed9b-bec0-4977-ad91-91128d2d7636-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360194 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 
11:34:12.360298 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq4zp\" (UniqueName: \"kubernetes.io/projected/b5078140-21f5-4159-96af-69a29c27ec36-kube-api-access-sq4zp\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360420 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f4c688-bb4b-4268-b2f5-21739872a26d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360465 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-srv-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360501 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k48zg\" (UniqueName: \"kubernetes.io/projected/ba30ed9b-bec0-4977-ad91-91128d2d7636-kube-api-access-k48zg\") pod \"machine-config-controller-84d6567774-2qt5c\" 
(UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/62a21827-2dc4-47ee-88b8-af5721a23829-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360624 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-csi-data-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360656 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frz7n\" (UniqueName: \"kubernetes.io/projected/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-kube-api-access-frz7n\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360685 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t882\" (UniqueName: \"kubernetes.io/projected/852830c1-9fd1-4c23-807a-fef5c5934c82-kube-api-access-9t882\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-node-bootstrap-token\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360822 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z5vq\" (UniqueName: \"kubernetes.io/projected/28f4c688-bb4b-4268-b2f5-21739872a26d-kube-api-access-4z5vq\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360848 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-stats-auth\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360869 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba30ed9b-bec0-4977-ad91-91128d2d7636-proxy-tls\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360896 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3675753d-4d26-4ec5-9abd-4e6f8966f56a-metrics-tls\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360907 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360956 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-config\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.360991 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-registration-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361033 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqzp\" (UniqueName: \"kubernetes.io/projected/4c700629-9c94-4e16-b584-e28f2103e68e-kube-api-access-zsqzp\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" 
Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361058 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-socket-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361098 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361213 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/852830c1-9fd1-4c23-807a-fef5c5934c82-service-ca-bundle\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361238 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppfc\" (UniqueName: \"kubernetes.io/projected/321f0b7a-1cf4-4814-81d3-fe25fd718555-kube-api-access-nppfc\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: 
\"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361279 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c700629-9c94-4e16-b584-e28f2103e68e-proxy-tls\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361387 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697eb94-a732-4bb6-90c2-cd97e857b60b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361423 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-plugins-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361462 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-krf6l\" (UniqueName: \"kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361483 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94tv\" (UniqueName: \"kubernetes.io/projected/3675753d-4d26-4ec5-9abd-4e6f8966f56a-kube-api-access-h94tv\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361528 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361570 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5jpq\" (UniqueName: \"kubernetes.io/projected/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-kube-api-access-g5jpq\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361593 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/321f0b7a-1cf4-4814-81d3-fe25fd718555-tmpfs\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 
11:34:12.361631 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl75f\" (UniqueName: \"kubernetes.io/projected/e697eb94-a732-4bb6-90c2-cd97e857b60b-kube-api-access-hl75f\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361654 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tws4w\" (UniqueName: \"kubernetes.io/projected/62a21827-2dc4-47ee-88b8-af5721a23829-kube-api-access-tws4w\") pod \"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361697 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361750 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc24512-e828-4d9c-acde-b03f888e9474-cert\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361771 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-certs\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " 
pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361794 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-images\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361816 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.361870 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-config\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.364409 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/321f0b7a-1cf4-4814-81d3-fe25fd718555-tmpfs\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.366261 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-images\") pod 
\"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.367637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.368362 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/852830c1-9fd1-4c23-807a-fef5c5934c82-service-ca-bundle\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.368448 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-metrics-certs\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.369245 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.370459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-webhook-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.371023 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-serving-cert\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.371042 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.371442 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/321f0b7a-1cf4-4814-81d3-fe25fd718555-apiservice-cert\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.371842 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba30ed9b-bec0-4977-ad91-91128d2d7636-proxy-tls\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.371958 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-default-certificate\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.372456 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.373468 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.375010 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-certs\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.379623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-node-bootstrap-token\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.380192 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-srv-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.382396 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c700629-9c94-4e16-b584-e28f2103e68e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.384194 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/62a21827-2dc4-47ee-88b8-af5721a23829-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.384486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c700629-9c94-4e16-b584-e28f2103e68e-proxy-tls\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.384917 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/852830c1-9fd1-4c23-807a-fef5c5934c82-stats-auth\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 
11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.385369 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697eb94-a732-4bb6-90c2-cd97e857b60b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.385628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.385781 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f4c688-bb4b-4268-b2f5-21739872a26d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.385981 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.387951 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:34:12 crc 
kubenswrapper[4745]: I1209 11:34:12.392887 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgm6l\" (UniqueName: \"kubernetes.io/projected/5146ef22-be05-4272-bfa7-80452a0a908f-kube-api-access-dgm6l\") pod \"migrator-59844c95c7-nrkr5\" (UID: \"5146ef22-be05-4272-bfa7-80452a0a908f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.410475 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzj4q\" (UniqueName: \"kubernetes.io/projected/73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e-kube-api-access-gzj4q\") pod \"catalog-operator-68c6474976-zhvrm\" (UID: \"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.460713 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cmw\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.465812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.466093 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:12.966070806 +0000 UTC m=+139.791272340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466251 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3675753d-4d26-4ec5-9abd-4e6f8966f56a-config-volume\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466280 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6brcw\" (UniqueName: \"kubernetes.io/projected/5fc24512-e828-4d9c-acde-b03f888e9474-kube-api-access-6brcw\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466306 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-mountpoint-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466333 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq4zp\" (UniqueName: \"kubernetes.io/projected/b5078140-21f5-4159-96af-69a29c27ec36-kube-api-access-sq4zp\") pod 
\"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466385 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-csi-data-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466420 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3675753d-4d26-4ec5-9abd-4e6f8966f56a-metrics-tls\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466441 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-socket-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466459 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-registration-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-plugins-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " 
pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466554 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94tv\" (UniqueName: \"kubernetes.io/projected/3675753d-4d26-4ec5-9abd-4e6f8966f56a-kube-api-access-h94tv\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466642 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc24512-e828-4d9c-acde-b03f888e9474-cert\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.466672 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.466954 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:12.96694492 +0000 UTC m=+139.792146454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.468212 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-plugins-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.468225 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-socket-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.468286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-registration-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.468300 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-csi-data-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc 
kubenswrapper[4745]: I1209 11:34:12.468646 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b5078140-21f5-4159-96af-69a29c27ec36-mountpoint-dir\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.469019 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3675753d-4d26-4ec5-9abd-4e6f8966f56a-config-volume\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.476078 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc24512-e828-4d9c-acde-b03f888e9474-cert\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.476549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frz7n\" (UniqueName: \"kubernetes.io/projected/aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b-kube-api-access-frz7n\") pod \"service-ca-operator-777779d784-ppx9x\" (UID: \"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.484910 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kjsvk"] Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.489345 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3675753d-4d26-4ec5-9abd-4e6f8966f56a-metrics-tls\") pod \"dns-default-jv8nd\" (UID: 
\"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.495347 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krf6l\" (UniqueName: \"kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l\") pod \"collect-profiles-29421330-9k5cb\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.498668 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4"] Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.524096 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k48zg\" (UniqueName: \"kubernetes.io/projected/ba30ed9b-bec0-4977-ad91-91128d2d7636-kube-api-access-k48zg\") pod \"machine-config-controller-84d6567774-2qt5c\" (UID: \"ba30ed9b-bec0-4977-ad91-91128d2d7636\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.524357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.540963 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.549341 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.553071 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5jpq\" (UniqueName: \"kubernetes.io/projected/179fdaf2-ddd5-4d07-acfc-44f510ed76c7-kube-api-access-g5jpq\") pod \"machine-config-server-lttn7\" (UID: \"179fdaf2-ddd5-4d07-acfc-44f510ed76c7\") " pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.555693 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.568001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.568564 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.068543473 +0000 UTC m=+139.893744997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.571285 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl75f\" (UniqueName: \"kubernetes.io/projected/e697eb94-a732-4bb6-90c2-cd97e857b60b-kube-api-access-hl75f\") pod \"control-plane-machine-set-operator-78cbb6b69f-nnh4w\" (UID: \"e697eb94-a732-4bb6-90c2-cd97e857b60b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.591825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.607601 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.622963 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lttn7" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.630458 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tws4w\" (UniqueName: \"kubernetes.io/projected/62a21827-2dc4-47ee-88b8-af5721a23829-kube-api-access-tws4w\") pod \"multus-admission-controller-857f4d67dd-7ftnn\" (UID: \"62a21827-2dc4-47ee-88b8-af5721a23829\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.647124 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t882\" (UniqueName: \"kubernetes.io/projected/852830c1-9fd1-4c23-807a-fef5c5934c82-kube-api-access-9t882\") pod \"router-default-5444994796-mnbl6\" (UID: \"852830c1-9fd1-4c23-807a-fef5c5934c82\") " pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.647976 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppfc\" (UniqueName: \"kubernetes.io/projected/321f0b7a-1cf4-4814-81d3-fe25fd718555-kube-api-access-nppfc\") pod \"packageserver-d55dfcdfc-72wx4\" (UID: \"321f0b7a-1cf4-4814-81d3-fe25fd718555\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.669536 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.669918 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.169903629 +0000 UTC m=+139.995105153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.683782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqzp\" (UniqueName: \"kubernetes.io/projected/4c700629-9c94-4e16-b584-e28f2103e68e-kube-api-access-zsqzp\") pod \"machine-config-operator-74547568cd-wlvvb\" (UID: \"4c700629-9c94-4e16-b584-e28f2103e68e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.685264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z5vq\" (UniqueName: \"kubernetes.io/projected/28f4c688-bb4b-4268-b2f5-21739872a26d-kube-api-access-4z5vq\") pod \"package-server-manager-789f6589d5-n7w49\" (UID: \"28f4c688-bb4b-4268-b2f5-21739872a26d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.715901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94tv\" (UniqueName: \"kubernetes.io/projected/3675753d-4d26-4ec5-9abd-4e6f8966f56a-kube-api-access-h94tv\") pod \"dns-default-jv8nd\" (UID: \"3675753d-4d26-4ec5-9abd-4e6f8966f56a\") " pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc 
kubenswrapper[4745]: I1209 11:34:12.770846 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.770945 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.270929056 +0000 UTC m=+140.096130570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.771180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.771496 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:13.271485822 +0000 UTC m=+140.096687346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.772398 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq4zp\" (UniqueName: \"kubernetes.io/projected/b5078140-21f5-4159-96af-69a29c27ec36-kube-api-access-sq4zp\") pod \"csi-hostpathplugin-r79qf\" (UID: \"b5078140-21f5-4159-96af-69a29c27ec36\") " pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.792437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6brcw\" (UniqueName: \"kubernetes.io/projected/5fc24512-e828-4d9c-acde-b03f888e9474-kube-api-access-6brcw\") pod \"ingress-canary-p4p6k\" (UID: \"5fc24512-e828-4d9c-acde-b03f888e9474\") " pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.830203 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.836626 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.845476 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.872128 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.872830 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.872986 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.372957881 +0000 UTC m=+140.198159405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.873185 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.873640 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.37363193 +0000 UTC m=+140.198833454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.900259 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.916243 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.932957 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.941719 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p4p6k" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.966241 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" Dec 09 11:34:12 crc kubenswrapper[4745]: I1209 11:34:12.974631 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:12 crc kubenswrapper[4745]: E1209 11:34:12.974928 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.474914243 +0000 UTC m=+140.300115767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.075889 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.076198 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.576187717 +0000 UTC m=+140.401389241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.129217 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8r64w"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.140091 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.140141 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.145056 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hk5ns"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.177055 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.177411 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.677386799 +0000 UTC m=+140.502588323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.206191 4745 generic.go:334] "Generic (PLEG): container finished" podID="c500e9ec-eac6-45e4-bb6d-209e92ffbdad" containerID="07dfbac15716637f6a9bc76d1f3f1ca8852b0d722e300849a3340b8758a5d98f" exitCode=0 Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.206231 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" event={"ID":"c500e9ec-eac6-45e4-bb6d-209e92ffbdad","Type":"ContainerDied","Data":"07dfbac15716637f6a9bc76d1f3f1ca8852b0d722e300849a3340b8758a5d98f"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.211366 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" event={"ID":"6cfa061d-49b8-4640-ae8d-674ef0832ef7","Type":"ContainerStarted","Data":"26e7757bf9f67a097fbd74cc86f6930cfeb91f98f762fbdfb3beeeb68f7a9a52"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.211405 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" event={"ID":"6cfa061d-49b8-4640-ae8d-674ef0832ef7","Type":"ContainerStarted","Data":"4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.213117 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 
11:34:13.216065 4745 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wjx47 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.216107 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.220532 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" event={"ID":"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04","Type":"ContainerStarted","Data":"5d7c46abfc4516c1482854d8dbdb49974a388be2bd3172aa79530b857ff87d13"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.220580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" event={"ID":"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04","Type":"ContainerStarted","Data":"47b899736165ec604a47ad93cebbd1b32057f8d97fac371ce59c57f81ef7d731"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.228297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" event={"ID":"33da92e8-30d5-47b4-9d6a-496d4d1d1306","Type":"ContainerStarted","Data":"3b98fccb4073faf09b4fe460077d38d1cf17bc4c97dceedabd8d32ec8dd04a94"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.228335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" 
event={"ID":"33da92e8-30d5-47b4-9d6a-496d4d1d1306","Type":"ContainerStarted","Data":"2c6cda19daa7788410c91f9af171858009d9022cdfb94d42168c7bac7ffdcfbb"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.229749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mnbl6" event={"ID":"852830c1-9fd1-4c23-807a-fef5c5934c82","Type":"ContainerStarted","Data":"7f641c70da481fb3873788957d97fc013c33aeeaa4307faa63484c5905f96791"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.230753 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lttn7" event={"ID":"179fdaf2-ddd5-4d07-acfc-44f510ed76c7","Type":"ContainerStarted","Data":"261359ddb02f56f1a3b06e58eb985be8153e7a56ed4211bf78af99215b0dde80"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.232183 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" event={"ID":"fa5c85c2-26a5-44d3-a759-080eb6198c6d","Type":"ContainerStarted","Data":"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.232202 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" event={"ID":"fa5c85c2-26a5-44d3-a759-080eb6198c6d","Type":"ContainerStarted","Data":"367a14a5405ff68733e157dfcb2a97d90f270c21f7d6a7a32bd7f6d0cdb957a8"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.233881 4745 generic.go:334] "Generic (PLEG): container finished" podID="50fd29d0-8021-4cf0-ad83-dc5c679aeb43" containerID="fd0b6953a23302141a81dacacf4f0a24363c2092a9b2598412e27ba2904e9d92" exitCode=0 Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.233963 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" 
event={"ID":"50fd29d0-8021-4cf0-ad83-dc5c679aeb43","Type":"ContainerDied","Data":"fd0b6953a23302141a81dacacf4f0a24363c2092a9b2598412e27ba2904e9d92"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.234909 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" event={"ID":"8a071b7a-930e-46f6-91d3-3aefcacf5eec","Type":"ContainerStarted","Data":"0e68a209f13bb22c85e7e6fe057d14d77b7f504e525dc1a56fcbde9439f4f296"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.236137 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" event={"ID":"7b8d6488-91ee-4d61-a22d-377b3d2c8aab","Type":"ContainerStarted","Data":"d2a8c12036ec2595077f49f3d3b63c51b303c8caf1c04ca529d7dcfd558a8dc5"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.236165 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" event={"ID":"7b8d6488-91ee-4d61-a22d-377b3d2c8aab","Type":"ContainerStarted","Data":"4f8077a170a1febacf8bef63217a4280f0b62d20dc8b0cebea7a8bc983f30102"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.252073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" event={"ID":"413895ff-f52a-401e-a1f4-67a2ee0bc52c","Type":"ContainerStarted","Data":"5a47acd6278daae934901df75bcafa877e4a656a90fe0f0f90b7dcd183e691a8"} Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.278387 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 
11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.278811 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.778798157 +0000 UTC m=+140.603999681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: W1209 11:34:13.337105 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd02382a6_7e0e_4274_bbe2_e713ac39756c.slice/crio-1d8c172f004d1bef761b5ab3b7b5e786dc336f859c3c030d8fa1ec2ba5cd7101 WatchSource:0}: Error finding container 1d8c172f004d1bef761b5ab3b7b5e786dc336f859c3c030d8fa1ec2ba5cd7101: Status 404 returned error can't find the container with id 1d8c172f004d1bef761b5ab3b7b5e786dc336f859c3c030d8fa1ec2ba5cd7101 Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.380057 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.381286 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.881265003 +0000 UTC m=+140.706466527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: W1209 11:34:13.407725 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52737279_7072_4bdb_9e9b_8e8cd58d44c9.slice/crio-6411c8defe99fe6253eac054fa9b69f5d3f0c57ac3129ba0cf8466359b670d4c WatchSource:0}: Error finding container 6411c8defe99fe6253eac054fa9b69f5d3f0c57ac3129ba0cf8466359b670d4c: Status 404 returned error can't find the container with id 6411c8defe99fe6253eac054fa9b69f5d3f0c57ac3129ba0cf8466359b670d4c Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.486368 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.487552 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:13.987537596 +0000 UTC m=+140.812739120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.587978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.588338 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.088319937 +0000 UTC m=+140.913521461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.595412 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" podStartSLOduration=120.595392493 podStartE2EDuration="2m0.595392493s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:13.553934961 +0000 UTC m=+140.379136485" watchObservedRunningTime="2025-12-09 11:34:13.595392493 +0000 UTC m=+140.420594017" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.689154 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.689438 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.189427366 +0000 UTC m=+141.014628880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.690020 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.784817 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkn7h" podStartSLOduration=120.784799436 podStartE2EDuration="2m0.784799436s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:13.783576332 +0000 UTC m=+140.608777866" watchObservedRunningTime="2025-12-09 11:34:13.784799436 +0000 UTC m=+140.610000970" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.789870 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.790218 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:14.290202426 +0000 UTC m=+141.115403950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.878836 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwmfn"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.888054 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.892489 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:13 crc kubenswrapper[4745]: E1209 11:34:13.892885 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.392872789 +0000 UTC m=+141.218074313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.903908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.913225 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p57l4" podStartSLOduration=120.913208924 podStartE2EDuration="2m0.913208924s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:13.903658979 +0000 UTC m=+140.728860513" watchObservedRunningTime="2025-12-09 11:34:13.913208924 +0000 UTC m=+140.738410448" Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.915271 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c87ls"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.937347 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.950951 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc"] Dec 09 11:34:13 crc kubenswrapper[4745]: I1209 11:34:13.959322 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-vxndp"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:13.995666 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:13.996787 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.496756715 +0000 UTC m=+141.321958239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.081974 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d93c0b_6422_4a07_98dd_5a9eacdc1fa3.slice/crio-627cee6c4c4728c3723d0344016f0fcb27fa85063e17a61c74e7a13055fe7a6b WatchSource:0}: Error finding container 627cee6c4c4728c3723d0344016f0fcb27fa85063e17a61c74e7a13055fe7a6b: Status 404 returned error can't find the container with id 627cee6c4c4728c3723d0344016f0fcb27fa85063e17a61c74e7a13055fe7a6b Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.097613 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.097975 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.597963547 +0000 UTC m=+141.423165071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.107073 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w5n22"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.115467 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.161299 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.179758 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j258s"] Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.180959 4745 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c51afdf_fde4_4147_814c_8befb1ad7d1f.slice/crio-24d3d49c7a056eaf925cd7d7b7ae7acb5bd88131679859401a3faa6c7e8a50e1 WatchSource:0}: Error finding container 24d3d49c7a056eaf925cd7d7b7ae7acb5bd88131679859401a3faa6c7e8a50e1: Status 404 returned error can't find the container with id 24d3d49c7a056eaf925cd7d7b7ae7acb5bd88131679859401a3faa6c7e8a50e1 Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.184841 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.195709 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.198536 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.198782 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.698752257 +0000 UTC m=+141.523953791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.198864 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.199211 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.69920097 +0000 UTC m=+141.524402504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.208880 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5372cf2_1aac_4a33_96ae_f6e7e612195a.slice/crio-3d2d0e1f33e5938d57965e9d7feae0e85b852eb8300a7556a07a4ee9b90054ee WatchSource:0}: Error finding container 3d2d0e1f33e5938d57965e9d7feae0e85b852eb8300a7556a07a4ee9b90054ee: Status 404 returned error can't find the container with id 3d2d0e1f33e5938d57965e9d7feae0e85b852eb8300a7556a07a4ee9b90054ee Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.217488 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.300111 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.300788 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:14.800752331 +0000 UTC m=+141.625953855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.302161 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" event={"ID":"50fd29d0-8021-4cf0-ad83-dc5c679aeb43","Type":"ContainerStarted","Data":"ad1682006e24c065991d8d8f89dfffb2f9ab453b86099ad3c7baf29443f74c1f"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.302374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.305759 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mnbl6" event={"ID":"852830c1-9fd1-4c23-807a-fef5c5934c82","Type":"ContainerStarted","Data":"6b664d229e6f13a636f15d448d53991ee7b1eb1ecc9c40c0775a38dacd24f083"} Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.321580 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb47373e1_3530_4cfd_a139_02674d232523.slice/crio-ee57f7d1f67bbe4694faae255ecc709c30a980b1dff1c92d985be98e4edd23ae WatchSource:0}: Error finding container ee57f7d1f67bbe4694faae255ecc709c30a980b1dff1c92d985be98e4edd23ae: Status 404 returned error can't find the container with id ee57f7d1f67bbe4694faae255ecc709c30a980b1dff1c92d985be98e4edd23ae Dec 09 11:34:14 crc 
kubenswrapper[4745]: I1209 11:34:14.331337 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.331740 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lttn7" event={"ID":"179fdaf2-ddd5-4d07-acfc-44f510ed76c7","Type":"ContainerStarted","Data":"98c0c208edf0bcc934ef19a66947e2c5250e60e1afa7aaede91c9971ee1b6b7c"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.333318 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.359420 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" event={"ID":"69e84e12-ef0d-46b5-8d5b-fde7b040e11b","Type":"ContainerStarted","Data":"2764896b4f7516f873f16dd78e2d408f50aea4d45b3fb277f9194971d17e06d1"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.389470 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" event={"ID":"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3","Type":"ContainerStarted","Data":"627cee6c4c4728c3723d0344016f0fcb27fa85063e17a61c74e7a13055fe7a6b"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.390582 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" event={"ID":"52737279-7072-4bdb-9e9b-8e8cd58d44c9","Type":"ContainerStarted","Data":"5a4377b190a42a46430e4bad99280e41c767b18dfb1bb2c304f6ef3c2905fdf1"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.390603 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" 
event={"ID":"52737279-7072-4bdb-9e9b-8e8cd58d44c9","Type":"ContainerStarted","Data":"6411c8defe99fe6253eac054fa9b69f5d3f0c57ac3129ba0cf8466359b670d4c"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.391438 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" event={"ID":"cfc45955-ae9e-460c-97df-1d23f960c862","Type":"ContainerStarted","Data":"96917e5106cd481419acb013c7325cd86cb5a189def894e2b000610a5d38998e"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.392595 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" event={"ID":"4eb738d6-4a1b-4790-9376-0f416784bf8e","Type":"ContainerStarted","Data":"160eaffec8aaa27240b649cde30487792d2bdb549e5d44422e8b87f8ec4cee05"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.402315 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" event={"ID":"1295af6d-c037-4fa4-adfb-1d43919d86ed","Type":"ContainerStarted","Data":"eee35efa3711e870bdf8ec5647d5089f95e886349d47fa63bef16fe9f6f2f60d"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.405642 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.407265 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:14.907250741 +0000 UTC m=+141.732452255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.418235 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.425258 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" event={"ID":"c500e9ec-eac6-45e4-bb6d-209e92ffbdad","Type":"ContainerStarted","Data":"955324b47af669319f88ff9ef63df01c0752a08cec213046ba935b799814508a"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.430415 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" event={"ID":"a41a4525-7677-4db8-b040-d4c1edfcc9a0","Type":"ContainerStarted","Data":"463b9bdea4b98b8d4bc903f8453c1fdc6959429b42de634d6a967b71ab9c20f1"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.438113 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" event={"ID":"e5372cf2-1aac-4a33-96ae-f6e7e612195a","Type":"ContainerStarted","Data":"3d2d0e1f33e5938d57965e9d7feae0e85b852eb8300a7556a07a4ee9b90054ee"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.439251 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mcrk5" event={"ID":"8c51afdf-fde4-4147-814c-8befb1ad7d1f","Type":"ContainerStarted","Data":"24d3d49c7a056eaf925cd7d7b7ae7acb5bd88131679859401a3faa6c7e8a50e1"} Dec 09 11:34:14 crc 
kubenswrapper[4745]: W1209 11:34:14.441646 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf1dc1d_aae4_4a14_bd7d_8d3fbd17752b.slice/crio-7905ea8d57d729262e13e12938e79003f70ad1daf3f092d3ef3de641fe8cd543 WatchSource:0}: Error finding container 7905ea8d57d729262e13e12938e79003f70ad1daf3f092d3ef3de641fe8cd543: Status 404 returned error can't find the container with id 7905ea8d57d729262e13e12938e79003f70ad1daf3f092d3ef3de641fe8cd543 Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.441960 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" event={"ID":"8a071b7a-930e-46f6-91d3-3aefcacf5eec","Type":"ContainerStarted","Data":"948f0f3a865ea81ff3fbb7e36411a6d1f74f0dcff4784886f7d87f39432f1fa3"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.443681 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" event={"ID":"3824312f-b03f-42da-890e-53a61841a8b0","Type":"ContainerStarted","Data":"73a3b538c73a95dcfd9c6e09f4433724b3baff82660b0e862a2346b0d2caaa4e"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.450928 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" event={"ID":"413895ff-f52a-401e-a1f4-67a2ee0bc52c","Type":"ContainerStarted","Data":"4f1a4e945cc3a51bf8e15199244bbdd717724857fd8fab74982703d19c5e8c39"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.450972 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.457257 4745 patch_prober.go:28] interesting pod/console-operator-58897d9998-kjsvk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.457313 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" podUID="413895ff-f52a-401e-a1f4-67a2ee0bc52c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.476712 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a4ef64_9cfd_4e37_b2a8_8d256f30ed9e.slice/crio-08fc02e1a33aca7407c262c98eb2f0c01d8cff0393b240772ec4945f553c2ce2 WatchSource:0}: Error finding container 08fc02e1a33aca7407c262c98eb2f0c01d8cff0393b240772ec4945f553c2ce2: Status 404 returned error can't find the container with id 08fc02e1a33aca7407c262c98eb2f0c01d8cff0393b240772ec4945f553c2ce2 Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.479968 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.486835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" event={"ID":"ba30ed9b-bec0-4977-ad91-91128d2d7636","Type":"ContainerStarted","Data":"22be587ee14d3b98ee7d2635eedf86bb263e29a6102998323656a27dd85eb967"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.492012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" event={"ID":"5c3ac7c1-b7a4-4a1a-9279-723fe4e26a04","Type":"ContainerStarted","Data":"441961cb2e8bdde8623078a6bd1d1fbb2112a2e375c7f383f297ef72803d8bf2"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 
11:34:14.499806 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" event={"ID":"ea7071e0-9d7c-4b66-b404-01511502284c","Type":"ContainerStarted","Data":"b9d9e7c87afefd458c4467d27c0ba47a63bcb95b635a04bdd3103f51cdd2b5d5"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.500143 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" event={"ID":"ea7071e0-9d7c-4b66-b404-01511502284c","Type":"ContainerStarted","Data":"5dbf91aa00b708a0a306d74801a66657f6d87a5404fc44efc2b4ad52b9430713"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.509647 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.509830 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.00981323 +0000 UTC m=+141.835014754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.509959 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.512309 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.012296649 +0000 UTC m=+141.837498173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.541735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" event={"ID":"770be29f-5b58-4569-a4a8-0618adb2ed5c","Type":"ContainerStarted","Data":"1a788e0b74342b5b6ded2ce6cfc6bb090e78cbb9f0915e4144d72a8b86f3ab77"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.545078 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" event={"ID":"e0e69154-b28d-4d09-9fd7-3e28b08d20cd","Type":"ContainerStarted","Data":"5a09867ff7e8935efa43d877cafd39c0c3699634d21f11aa26cc72ca3074f79a"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.551779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8r64w" event={"ID":"d02382a6-7e0e-4274-bbe2-e713ac39756c","Type":"ContainerStarted","Data":"17bb6bf6f4f53c1dae1dc0ed92331c352fde91492ad6c1ea2cbf88baddc39593"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.551843 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8r64w" event={"ID":"d02382a6-7e0e-4274-bbe2-e713ac39756c","Type":"ContainerStarted","Data":"1d8c172f004d1bef761b5ab3b7b5e786dc336f859c3c030d8fa1ec2ba5cd7101"} Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.551860 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4"] Dec 09 11:34:14 crc 
kubenswrapper[4745]: I1209 11:34:14.559777 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.569007 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.569064 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.593735 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.596305 4745 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hj9bn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.596346 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.598859 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.601411 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r79qf"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.614751 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.616838 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.114988953 +0000 UTC m=+141.940190477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.617262 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.623141 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.123127059 +0000 UTC m=+141.948328573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.628146 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.636889 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7ftnn"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.649662 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.679804 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.720536 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng" podStartSLOduration=121.720491364 podStartE2EDuration="2m1.720491364s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:14.690379187 +0000 UTC m=+141.515580701" watchObservedRunningTime="2025-12-09 11:34:14.720491364 +0000 UTC m=+141.545692888" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.722795 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.724345 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.224324161 +0000 UTC m=+142.049525685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.740260 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jv8nd"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.750470 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p4p6k"] Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.770649 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lttn7" podStartSLOduration=5.770632097 podStartE2EDuration="5.770632097s" podCreationTimestamp="2025-12-09 11:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:14.769614049 +0000 UTC m=+141.594815573" watchObservedRunningTime="2025-12-09 11:34:14.770632097 +0000 UTC 
m=+141.595833621" Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.778209 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a21827_2dc4_47ee_88b8_af5721a23829.slice/crio-7220029df81ff05ac99097dcb615fa9226cd0d6b6cc2557ca9899521eedae523 WatchSource:0}: Error finding container 7220029df81ff05ac99097dcb615fa9226cd0d6b6cc2557ca9899521eedae523: Status 404 returned error can't find the container with id 7220029df81ff05ac99097dcb615fa9226cd0d6b6cc2557ca9899521eedae523 Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.781569 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f4c688_bb4b_4268_b2f5_21739872a26d.slice/crio-f2ceb21d183f67bfd2f5d4abad842c90c9fbebe77ab2d53251ceee568c19dcc8 WatchSource:0}: Error finding container f2ceb21d183f67bfd2f5d4abad842c90c9fbebe77ab2d53251ceee568c19dcc8: Status 404 returned error can't find the container with id f2ceb21d183f67bfd2f5d4abad842c90c9fbebe77ab2d53251ceee568c19dcc8 Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.824635 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.824922 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.324910225 +0000 UTC m=+142.150111749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.902204 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.917188 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:14 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Dec 09 11:34:14 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:14 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.917262 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:14 crc kubenswrapper[4745]: I1209 11:34:14.925367 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:14 crc kubenswrapper[4745]: E1209 11:34:14.925810 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.425790908 +0000 UTC m=+142.250992432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:14 crc kubenswrapper[4745]: W1209 11:34:14.937718 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc24512_e828_4d9c_acde_b03f888e9474.slice/crio-94cc00e1f74ee1aa796a774b4fd7dbefc37858051bbde50d4880285eb1cc5fbf WatchSource:0}: Error finding container 94cc00e1f74ee1aa796a774b4fd7dbefc37858051bbde50d4880285eb1cc5fbf: Status 404 returned error can't find the container with id 94cc00e1f74ee1aa796a774b4fd7dbefc37858051bbde50d4880285eb1cc5fbf Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.026537 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.026908 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:15.526895318 +0000 UTC m=+142.352096842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.128155 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.128313 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.628286605 +0000 UTC m=+142.453488139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.128778 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.129023 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.629012755 +0000 UTC m=+142.454214279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.176342 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mnbl6" podStartSLOduration=122.17632511 podStartE2EDuration="2m2.17632511s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.144154796 +0000 UTC m=+141.969356320" watchObservedRunningTime="2025-12-09 11:34:15.17632511 +0000 UTC m=+142.001526634" Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.208356 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kjsvk" podStartSLOduration=122.208336889 podStartE2EDuration="2m2.208336889s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.176857145 +0000 UTC m=+142.002058669" watchObservedRunningTime="2025-12-09 11:34:15.208336889 +0000 UTC m=+142.033538413" Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.236686 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.236980 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.736965915 +0000 UTC m=+142.562167439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.255184 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rbfb" podStartSLOduration=122.25514576 podStartE2EDuration="2m2.25514576s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.208045791 +0000 UTC m=+142.033247315" watchObservedRunningTime="2025-12-09 11:34:15.25514576 +0000 UTC m=+142.080347284" Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.306873 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8r64w" podStartSLOduration=122.306839496 podStartE2EDuration="2m2.306839496s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.272275356 +0000 UTC m=+142.097476880" 
watchObservedRunningTime="2025-12-09 11:34:15.306839496 +0000 UTC m=+142.132041020" Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.341136 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.341602 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.841581651 +0000 UTC m=+142.666783175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.344665 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" podStartSLOduration=122.344654717 podStartE2EDuration="2m2.344654717s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.307394791 +0000 UTC m=+142.132596325" watchObservedRunningTime="2025-12-09 11:34:15.344654717 +0000 UTC m=+142.169856241" Dec 09 11:34:15 crc kubenswrapper[4745]: 
I1209 11:34:15.433128 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" podStartSLOduration=122.433109454 podStartE2EDuration="2m2.433109454s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.430410249 +0000 UTC m=+142.255611773" watchObservedRunningTime="2025-12-09 11:34:15.433109454 +0000 UTC m=+142.258310978" Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.441555 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.441797 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:15.941785255 +0000 UTC m=+142.766986779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.546240 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.546578 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.046564857 +0000 UTC m=+142.871766381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.580392 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" event={"ID":"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6","Type":"ContainerStarted","Data":"a3ae95dd50d44771526a8c7c6127e9e4125652ed38bbed7a1acb1725985b2b3e"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.580427 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" event={"ID":"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e","Type":"ContainerStarted","Data":"08fc02e1a33aca7407c262c98eb2f0c01d8cff0393b240772ec4945f553c2ce2"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.585445 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" event={"ID":"5146ef22-be05-4272-bfa7-80452a0a908f","Type":"ContainerStarted","Data":"0ed4203b1aca5f8081a1d22308876d1342e673af41a2d78b85942a47ce6e3d9e"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.610581 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" event={"ID":"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b","Type":"ContainerStarted","Data":"7905ea8d57d729262e13e12938e79003f70ad1daf3f092d3ef3de641fe8cd543"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.640801 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" event={"ID":"c500e9ec-eac6-45e4-bb6d-209e92ffbdad","Type":"ContainerStarted","Data":"a2a901bb841c2f48e11512c8c0bf9a7b71ecae175db8c88f13900ccc6393d3a7"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.647925 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.648248 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.148233832 +0000 UTC m=+142.973435356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.663726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" event={"ID":"ecd48ed5-6e8d-40f5-b3d1-1254df80a033","Type":"ContainerStarted","Data":"701d312fd1f75eaa0d7b74f5ab33d7dfdb9a91261a5c16fbdddb393df912285c"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.685430 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hk5ns" podStartSLOduration=122.685414275 podStartE2EDuration="2m2.685414275s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.486136438 +0000 UTC m=+142.311337962" watchObservedRunningTime="2025-12-09 11:34:15.685414275 +0000 UTC m=+142.510615799"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.715333 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.717350 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w5n22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.717473 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.729704 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" event={"ID":"28f4c688-bb4b-4268-b2f5-21739872a26d","Type":"ContainerStarted","Data":"f2ceb21d183f67bfd2f5d4abad842c90c9fbebe77ab2d53251ceee568c19dcc8"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.747917 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" event={"ID":"770be29f-5b58-4569-a4a8-0618adb2ed5c","Type":"ContainerStarted","Data":"5de138f1c07e8a5f680ed2a3c9947c30babc50250d289f38a4aa5ee31f382482"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.748827 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.750014 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.751350 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.251338986 +0000 UTC m=+143.076540510 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.752457 4745 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-65bct container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.752540 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" podUID="770be29f-5b58-4569-a4a8-0618adb2ed5c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.754940 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" podStartSLOduration=122.754923876 podStartE2EDuration="2m2.754923876s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.752808307 +0000 UTC m=+142.578009841" watchObservedRunningTime="2025-12-09 11:34:15.754923876 +0000 UTC m=+142.580125400"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.758216 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" podStartSLOduration=122.758208887 podStartE2EDuration="2m2.758208887s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.698543899 +0000 UTC m=+142.523745423" watchObservedRunningTime="2025-12-09 11:34:15.758208887 +0000 UTC m=+142.583410411"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.775886 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" event={"ID":"e697eb94-a732-4bb6-90c2-cd97e857b60b","Type":"ContainerStarted","Data":"46517d14930bfba853dfa66aad999f72af441e05d702dd1c828fe396771c691e"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.785479 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" podStartSLOduration=122.785466015 podStartE2EDuration="2m2.785466015s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.785113065 +0000 UTC m=+142.610314589" watchObservedRunningTime="2025-12-09 11:34:15.785466015 +0000 UTC m=+142.610667529"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.828839 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p4p6k" event={"ID":"5fc24512-e828-4d9c-acde-b03f888e9474","Type":"ContainerStarted","Data":"94cc00e1f74ee1aa796a774b4fd7dbefc37858051bbde50d4880285eb1cc5fbf"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.846171 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" event={"ID":"69e84e12-ef0d-46b5-8d5b-fde7b040e11b","Type":"ContainerStarted","Data":"8d6d6c1c9c00eb3397b203aa870439030be266ab0ad36d2e5d7a069a10fb7fab"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.853023 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.853429 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.353412722 +0000 UTC m=+143.178614246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.853604 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.853917 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.353910236 +0000 UTC m=+143.179111760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.877804 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" event={"ID":"321f0b7a-1cf4-4814-81d3-fe25fd718555","Type":"ContainerStarted","Data":"c4352fdd884bac4335c8a48267d9e0ed0119dbedf53424b3a6bfa14daaf8b0b3"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.910740 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" event={"ID":"4c700629-9c94-4e16-b584-e28f2103e68e","Type":"ContainerStarted","Data":"bc58e78151226ee94daf0ca8fc41790357067f5859ac4c8eaa36531f7abb06e0"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.923834 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 11:34:15 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Dec 09 11:34:15 crc kubenswrapper[4745]: [+]process-running ok
Dec 09 11:34:15 crc kubenswrapper[4745]: healthz check failed
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.923900 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.944828 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" event={"ID":"4eb738d6-4a1b-4790-9376-0f416784bf8e","Type":"ContainerStarted","Data":"a52c3211dbdebdd1030bef5b627b7a4f5cdb311d574805cbf5b83c54424637da"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.955984 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:15 crc kubenswrapper[4745]: E1209 11:34:15.957165 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.457149635 +0000 UTC m=+143.282351159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.962829 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jv8nd" event={"ID":"3675753d-4d26-4ec5-9abd-4e6f8966f56a","Type":"ContainerStarted","Data":"a250d437be0ee7a769918834bd27a723b6903f379a74943432e61f15df253329"}
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.977726 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwmfn" podStartSLOduration=122.977706016 podStartE2EDuration="2m2.977706016s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.977284194 +0000 UTC m=+142.802485718" watchObservedRunningTime="2025-12-09 11:34:15.977706016 +0000 UTC m=+142.802907540"
Dec 09 11:34:15 crc kubenswrapper[4745]: I1209 11:34:15.978323 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9hssr" podStartSLOduration=122.978318683 podStartE2EDuration="2m2.978318683s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:15.871962008 +0000 UTC m=+142.697163532" watchObservedRunningTime="2025-12-09 11:34:15.978318683 +0000 UTC m=+142.803520207"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.004023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvs6z" event={"ID":"cfc45955-ae9e-460c-97df-1d23f960c862","Type":"ContainerStarted","Data":"8673e4e39ac330fa9ce56a050c4ebe0cf8394387bf43e6a1a992d4828df51c9f"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.019132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" event={"ID":"62a21827-2dc4-47ee-88b8-af5721a23829","Type":"ContainerStarted","Data":"7220029df81ff05ac99097dcb615fa9226cd0d6b6cc2557ca9899521eedae523"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.027006 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" event={"ID":"b47373e1-3530-4cfd-a139-02674d232523","Type":"ContainerStarted","Data":"ee57f7d1f67bbe4694faae255ecc709c30a980b1dff1c92d985be98e4edd23ae"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.034646 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" event={"ID":"719b893a-e0dd-4a09-86ba-2c5b177ba8b6","Type":"ContainerStarted","Data":"1bd982f1dad85dd531f99cbf106621f5bb78241e46716003cecc394803ba5fa0"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.045128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" event={"ID":"8a071b7a-930e-46f6-91d3-3aefcacf5eec","Type":"ContainerStarted","Data":"8fb1f27faf76546976226098066c28d1aadc1dff7ab1eb55e28b1747c9179b7a"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.057409 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.059094 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.559082397 +0000 UTC m=+143.384283921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.088931 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" event={"ID":"52737279-7072-4bdb-9e9b-8e8cd58d44c9","Type":"ContainerStarted","Data":"35b45a9b077eebe3fb01eed28b030ac1b11b660f4d6f59793bc8aad74db353c6"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.108467 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b2kgm" podStartSLOduration=123.108452709 podStartE2EDuration="2m3.108452709s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:16.070161275 +0000 UTC m=+142.895362799" watchObservedRunningTime="2025-12-09 11:34:16.108452709 +0000 UTC m=+142.933654233"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.116890 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" event={"ID":"b5078140-21f5-4159-96af-69a29c27ec36","Type":"ContainerStarted","Data":"6d020595a536104569a1e7565a1e3e1f576a7e42c7cb06ed1a983d893aaaca5a"}
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.118893 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.118962 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.140071 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pmsng"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.159140 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.160014 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.160112 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.660093234 +0000 UTC m=+143.485294758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.166827 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.168698 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.668682672 +0000 UTC m=+143.493884196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.188765 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-snhrr" podStartSLOduration=123.188708419 podStartE2EDuration="2m3.188708419s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:16.160082903 +0000 UTC m=+142.985284427" watchObservedRunningTime="2025-12-09 11:34:16.188708419 +0000 UTC m=+143.013909963"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.199033 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" podStartSLOduration=123.199012545 podStartE2EDuration="2m3.199012545s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:16.109491888 +0000 UTC m=+142.934693412" watchObservedRunningTime="2025-12-09 11:34:16.199012545 +0000 UTC m=+143.024214059"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.199837 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kjsvk"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.275466 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.283927 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.783907964 +0000 UTC m=+143.609109488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.287682 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.287897 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq"
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.379275 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.379610 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.879598743 +0000 UTC m=+143.704800267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.482135 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.482855 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:16.982839071 +0000 UTC m=+143.808040595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.586257 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.586595 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.086580993 +0000 UTC m=+143.911782517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.686877 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.687224 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.187209109 +0000 UTC m=+144.012410633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.789369 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.789873 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.289858651 +0000 UTC m=+144.115060175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.896695 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:16 crc kubenswrapper[4745]: E1209 11:34:16.897095 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.39707992 +0000 UTC m=+144.222281444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.914704 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 11:34:16 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Dec 09 11:34:16 crc kubenswrapper[4745]: [+]process-running ok
Dec 09 11:34:16 crc kubenswrapper[4745]: healthz check failed
Dec 09 11:34:16 crc kubenswrapper[4745]: I1209 11:34:16.914750 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.015229 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.015792 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.515777038 +0000 UTC m=+144.340978562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.122109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.122461 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.622446302 +0000 UTC m=+144.447647826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.216624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" event={"ID":"62a21827-2dc4-47ee-88b8-af5721a23829","Type":"ContainerStarted","Data":"ec0d08cfdfe9d9ec122e8c9ecb51cdb9214a9056256b94f18037d961087f596e"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.225971 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.226304 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.726290687 +0000 UTC m=+144.551492211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.235008 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" event={"ID":"b47373e1-3530-4cfd-a139-02674d232523","Type":"ContainerStarted","Data":"e0ffbf856e31238046b16ba958f43aa1f6e09efdc6fd17f70e5eb4ec867f34ca"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.259831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jv8nd" event={"ID":"3675753d-4d26-4ec5-9abd-4e6f8966f56a","Type":"ContainerStarted","Data":"64231802e21ba52b90877c09aec2ad77c0d4ff94e385708f1a67130127fb2717"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.281835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" event={"ID":"e697eb94-a732-4bb6-90c2-cd97e857b60b","Type":"ContainerStarted","Data":"85834712c44c1730eeeeef6fb91f0ff8666fdb57fa71d6f71529ed998d26478d"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.296498 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" event={"ID":"38d93c0b-6422-4a07-98dd-5a9eacdc1fa3","Type":"ContainerStarted","Data":"760eb8f293d4fe6426d1af110802859fb2b0b3ec4d1df86405a6fb48e122e85b"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.325154 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" event={"ID":"e0e69154-b28d-4d09-9fd7-3e28b08d20cd","Type":"ContainerStarted","Data":"1419abb257db66508cbe47ad4955a3b9efac78de94ca6d7cbd043245fc3495c3"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.326845 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.328969 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.828935049 +0000 UTC m=+144.654136563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.357907 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" event={"ID":"1295af6d-c037-4fa4-adfb-1d43919d86ed","Type":"ContainerStarted","Data":"0858683838e1b01a6406caaab2c04c6904dabffc0289d0cf69ab8ec07e9faa4d"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.360087 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p4p6k" event={"ID":"5fc24512-e828-4d9c-acde-b03f888e9474","Type":"ContainerStarted","Data":"2c5ec19afe7227482cec11aaf6201c0206b327bd7040aed9132a0636e8888ba0"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.398190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" event={"ID":"3824312f-b03f-42da-890e-53a61841a8b0","Type":"ContainerStarted","Data":"1ef0470226d272b89ae61d0ff568380a20f4fd7bc6c329d7aa20afbc7a0b12df"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.399156 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.409862 4745 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-j258s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.36:6443/healthz\": dial tcp 10.217.0.36:6443: connect: connection refused" start-of-body= Dec 09 11:34:17 crc 
kubenswrapper[4745]: I1209 11:34:17.409904 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" podUID="3824312f-b03f-42da-890e-53a61841a8b0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.36:6443/healthz\": dial tcp 10.217.0.36:6443: connect: connection refused" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.421940 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" event={"ID":"321f0b7a-1cf4-4814-81d3-fe25fd718555","Type":"ContainerStarted","Data":"9e65093eff81331e3a3179300e68bd17b73e680b9559b3a0818c7507750b9176"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.422817 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.428775 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.429648 4745 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-72wx4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.429704 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" podUID="321f0b7a-1cf4-4814-81d3-fe25fd718555" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.432116 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:17.932103676 +0000 UTC m=+144.757305200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.443760 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" event={"ID":"5eb2aeb6-3b91-4f8e-a400-1c3c8f852ef6","Type":"ContainerStarted","Data":"fb9e2c11733b9c47fd5e25386de5ba87fe1949a298743c3613265cd51c823abe"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.470784 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" event={"ID":"73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e","Type":"ContainerStarted","Data":"ae661e0e583e18c987238763ba160162717920f524352537c2bd3998c12560d6"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.471614 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.476494 4745 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-zhvrm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.476561 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" podUID="73a4ef64-9cfd-4e37-b2a8-8d256f30ed9e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.503561 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" event={"ID":"5146ef22-be05-4272-bfa7-80452a0a908f","Type":"ContainerStarted","Data":"429ce0d1de8b93e92569aef5692b21d043e86df456ab30caff65a3a047824dd7"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.503605 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" event={"ID":"5146ef22-be05-4272-bfa7-80452a0a908f","Type":"ContainerStarted","Data":"c2183bec5b2631061b74a75f31bfbbdfa95746ae7215694ac3f0b9cf86a6881d"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.533957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mcrk5" event={"ID":"8c51afdf-fde4-4147-814c-8befb1ad7d1f","Type":"ContainerStarted","Data":"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.536063 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.537331 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.037311789 +0000 UTC m=+144.862513313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.550050 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nnh4w" podStartSLOduration=124.550030412 podStartE2EDuration="2m4.550030412s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:17.489010197 +0000 UTC m=+144.314211721" watchObservedRunningTime="2025-12-09 11:34:17.550030412 +0000 UTC m=+144.375231936" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.550278 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8vlj" podStartSLOduration=124.550272219 podStartE2EDuration="2m4.550272219s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-09 11:34:17.272990275 +0000 UTC m=+144.098191799" watchObservedRunningTime="2025-12-09 11:34:17.550272219 +0000 UTC m=+144.375473743" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.554854 4745 generic.go:334] "Generic (PLEG): container finished" podID="719b893a-e0dd-4a09-86ba-2c5b177ba8b6" containerID="412efb6f084d995bd605fef6b7fb86510a8945062da7921c1e70ae15db9d9c27" exitCode=0 Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.554930 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" event={"ID":"719b893a-e0dd-4a09-86ba-2c5b177ba8b6","Type":"ContainerDied","Data":"412efb6f084d995bd605fef6b7fb86510a8945062da7921c1e70ae15db9d9c27"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.595428 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" event={"ID":"aaf1dc1d-aae4-4a14-bd7d-8d3fbd17752b","Type":"ContainerStarted","Data":"3cfe49f4de38ed9260c42812d966d34902d567664244810b19158dc85e0f0824"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.633749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6nj4" event={"ID":"a41a4525-7677-4db8-b040-d4c1edfcc9a0","Type":"ContainerStarted","Data":"9fdf42a35c277f2454b5f8c42c1d773c4ec181d875d3f4a9f2838cfe8e864a99"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.635856 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" event={"ID":"4c700629-9c94-4e16-b584-e28f2103e68e","Type":"ContainerStarted","Data":"f573b96aee778208651ddf30e24cc869c3d308278f88275115ec28f666d27e20"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.637943 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.644345 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.144326032 +0000 UTC m=+144.969527556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.645091 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" event={"ID":"ba30ed9b-bec0-4977-ad91-91128d2d7636","Type":"ContainerStarted","Data":"889e6c62731bb4f9d08de7e7a2672f424eabe9d7c306aa72a60b705121c71b9e"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.645144 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" event={"ID":"ba30ed9b-bec0-4977-ad91-91128d2d7636","Type":"ContainerStarted","Data":"75a556eb91a5e86c3ba2c7e24a34da965e2aea616b37272cae543270a9d3553b"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.682786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" 
event={"ID":"ecd48ed5-6e8d-40f5-b3d1-1254df80a033","Type":"ContainerStarted","Data":"225ee8b0639543d5dff5ae33d4a25cd69f17c6e2179ae6e65a1ba3dc207d8c9a"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.700910 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" podStartSLOduration=124.700894894 podStartE2EDuration="2m4.700894894s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:17.691212145 +0000 UTC m=+144.516413659" watchObservedRunningTime="2025-12-09 11:34:17.700894894 +0000 UTC m=+144.526096408" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.740608 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.741579 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.241552634 +0000 UTC m=+145.066754158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.754109 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" event={"ID":"e5372cf2-1aac-4a33-96ae-f6e7e612195a","Type":"ContainerStarted","Data":"2cac00cc44cdab96ada68f056ca4b064ff081b3dedac581cfcaf5750bd4e0b4a"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.755463 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w5n22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.755491 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.791421 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" event={"ID":"28f4c688-bb4b-4268-b2f5-21739872a26d","Type":"ContainerStarted","Data":"9348073c332a37210f402248e4adef61774bd29a4e0f36cfe72de135cc03488a"} Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.791464 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.869946 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65bct" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.873797 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.874300 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.374284082 +0000 UTC m=+145.199485596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.911119 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" podStartSLOduration=124.911100895 podStartE2EDuration="2m4.911100895s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:17.833323124 +0000 UTC m=+144.658524648" watchObservedRunningTime="2025-12-09 11:34:17.911100895 +0000 UTC m=+144.736302419" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.912333 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c87ls" podStartSLOduration=124.912326769 podStartE2EDuration="2m4.912326769s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:17.910709564 +0000 UTC m=+144.735911088" watchObservedRunningTime="2025-12-09 11:34:17.912326769 +0000 UTC m=+144.737528293" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.921668 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:17 crc kubenswrapper[4745]: [-]has-synced 
failed: reason withheld Dec 09 11:34:17 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:17 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.921971 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.961065 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47jpc" podStartSLOduration=124.961049752 podStartE2EDuration="2m4.961049752s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:17.959957452 +0000 UTC m=+144.785158976" watchObservedRunningTime="2025-12-09 11:34:17.961049752 +0000 UTC m=+144.786251276" Dec 09 11:34:17 crc kubenswrapper[4745]: I1209 11:34:17.987882 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:17 crc kubenswrapper[4745]: E1209 11:34:17.989064 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.48904813 +0000 UTC m=+145.314249654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.056606 4745 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lwwlq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]log ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]etcd ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/max-in-flight-filter ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 09 11:34:18 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 09 11:34:18 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectcache ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-startinformers ok Dec 09 11:34:18 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 09 11:34:18 crc 
kubenswrapper[4745]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 11:34:18 crc kubenswrapper[4745]: livez check failed Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.056663 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" podUID="c500e9ec-eac6-45e4-bb6d-209e92ffbdad" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.063045 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" podStartSLOduration=125.063027416 podStartE2EDuration="2m5.063027416s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.061343 +0000 UTC m=+144.886544524" watchObservedRunningTime="2025-12-09 11:34:18.063027416 +0000 UTC m=+144.888228940" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.091381 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.091770 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.591755744 +0000 UTC m=+145.416957268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.106970 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p4p6k" podStartSLOduration=9.106953346 podStartE2EDuration="9.106953346s" podCreationTimestamp="2025-12-09 11:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.102901434 +0000 UTC m=+144.928102968" watchObservedRunningTime="2025-12-09 11:34:18.106953346 +0000 UTC m=+144.932154870" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.162421 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" podStartSLOduration=125.162400207 podStartE2EDuration="2m5.162400207s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.161302756 +0000 UTC m=+144.986504270" watchObservedRunningTime="2025-12-09 11:34:18.162400207 +0000 UTC m=+144.987601731" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.198259 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.199782 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.699765975 +0000 UTC m=+145.524967499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.220206 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sxw5b" podStartSLOduration=125.220191873 podStartE2EDuration="2m5.220191873s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.218815094 +0000 UTC m=+145.044016618" watchObservedRunningTime="2025-12-09 11:34:18.220191873 +0000 UTC m=+145.045393397" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.258358 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" podStartSLOduration=125.258342893 podStartE2EDuration="2m5.258342893s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-09 11:34:18.255880194 +0000 UTC m=+145.081081718" watchObservedRunningTime="2025-12-09 11:34:18.258342893 +0000 UTC m=+145.083544407" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.300253 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.300629 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.800614267 +0000 UTC m=+145.625815791 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.325411 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2qt5c" podStartSLOduration=125.325395616 podStartE2EDuration="2m5.325395616s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.290759493 +0000 UTC m=+145.115961017" watchObservedRunningTime="2025-12-09 
11:34:18.325395616 +0000 UTC m=+145.150597140" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.355380 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mcrk5" podStartSLOduration=125.355361828 podStartE2EDuration="2m5.355361828s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.326945139 +0000 UTC m=+145.152146663" watchObservedRunningTime="2025-12-09 11:34:18.355361828 +0000 UTC m=+145.180563352" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.392568 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" podStartSLOduration=125.392549592 podStartE2EDuration="2m5.392549592s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.391783081 +0000 UTC m=+145.216984615" watchObservedRunningTime="2025-12-09 11:34:18.392549592 +0000 UTC m=+145.217751126" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.392855 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrkr5" podStartSLOduration=125.39284976 podStartE2EDuration="2m5.39284976s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.358411263 +0000 UTC m=+145.183612787" watchObservedRunningTime="2025-12-09 11:34:18.39284976 +0000 UTC m=+145.218051284" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.409043 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.409414 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:18.90939629 +0000 UTC m=+145.734597814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.511173 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.511579 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.011562528 +0000 UTC m=+145.836764052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.611908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.612020 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.111995619 +0000 UTC m=+145.937197143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.612416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.612719 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.112706959 +0000 UTC m=+145.937908473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.622457 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ppx9x" podStartSLOduration=125.622436999 podStartE2EDuration="2m5.622436999s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.549601835 +0000 UTC m=+145.374803359" watchObservedRunningTime="2025-12-09 11:34:18.622436999 +0000 UTC m=+145.447638523" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.713563 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.713970 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.213954602 +0000 UTC m=+146.039156126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.814932 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.815264 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.315253417 +0000 UTC m=+146.140454941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.846723 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" event={"ID":"28f4c688-bb4b-4268-b2f5-21739872a26d","Type":"ContainerStarted","Data":"d8e14308d7e7a9dd7f84e5a8c5cca6e4ee2641ed3de165e42b5b0025d7933cd8"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.870831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" event={"ID":"b5078140-21f5-4159-96af-69a29c27ec36","Type":"ContainerStarted","Data":"648031cdccdcb63292bbb69261c6841523f088307467e391973781dbafc28aed"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.910611 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:18 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Dec 09 11:34:18 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:18 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.910669 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:18 crc 
kubenswrapper[4745]: I1209 11:34:18.911614 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jv8nd" event={"ID":"3675753d-4d26-4ec5-9abd-4e6f8966f56a","Type":"ContainerStarted","Data":"869ecc6694e229049fc3f3f99c49267734205aa04ffb8f09c36295b8922d6a76"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.912040 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.917981 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" event={"ID":"719b893a-e0dd-4a09-86ba-2c5b177ba8b6","Type":"ContainerStarted","Data":"3afc372a593df0d460eb1702cf09bcdd4bb9c5d784fdbd3a20ab21be5a37e20a"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.918550 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:18 crc kubenswrapper[4745]: E1209 11:34:18.919410 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.41939242 +0000 UTC m=+146.244593934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.941066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxndp" event={"ID":"1295af6d-c037-4fa4-adfb-1d43919d86ed","Type":"ContainerStarted","Data":"69deb15a8befc090730ac64419f0c01f7201441625f7acfa53954cf6360781e4"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.966777 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" event={"ID":"62a21827-2dc4-47ee-88b8-af5721a23829","Type":"ContainerStarted","Data":"a9f900953e523efbfbf195a1cd300c6f96e87f82341547bab0353e86269a306f"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.988908 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlvvb" event={"ID":"4c700629-9c94-4e16-b584-e28f2103e68e","Type":"ContainerStarted","Data":"9c6fdde2d85696ad80028cd2910efddca6abec8a067c04ff7d0d8a4418e1f8f0"} Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.995044 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w5n22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 09 11:34:18 crc kubenswrapper[4745]: I1209 11:34:18.995268 4745 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.005995 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" podStartSLOduration=126.005975916 podStartE2EDuration="2m6.005975916s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:18.624819295 +0000 UTC m=+145.450020819" watchObservedRunningTime="2025-12-09 11:34:19.005975916 +0000 UTC m=+145.831177440" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.009903 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zhvrm" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.024009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.030050 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.530037495 +0000 UTC m=+146.355239019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.127104 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.127484 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.627467262 +0000 UTC m=+146.452668786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.149931 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.190687 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jv8nd" podStartSLOduration=10.190671958 podStartE2EDuration="10.190671958s" podCreationTimestamp="2025-12-09 11:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:19.005106192 +0000 UTC m=+145.830307716" watchObservedRunningTime="2025-12-09 11:34:19.190671958 +0000 UTC m=+146.015873472" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.228638 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.229010 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:19.728995643 +0000 UTC m=+146.554197167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.249566 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7ftnn" podStartSLOduration=126.249547044 podStartE2EDuration="2m6.249547044s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:19.192989782 +0000 UTC m=+146.018191306" watchObservedRunningTime="2025-12-09 11:34:19.249547044 +0000 UTC m=+146.074748568" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.314116 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" podStartSLOduration=126.314097377 podStartE2EDuration="2m6.314097377s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:19.276905384 +0000 UTC m=+146.102106908" watchObservedRunningTime="2025-12-09 11:34:19.314097377 +0000 UTC m=+146.139298931" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.329917 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.330122 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.830093592 +0000 UTC m=+146.655295116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.330237 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.330617 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.830606976 +0000 UTC m=+146.655808500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.431620 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.431828 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.931797797 +0000 UTC m=+146.756999321 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.431932 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.432231 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:19.932223529 +0000 UTC m=+146.757425053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.532891 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.533076 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.033046741 +0000 UTC m=+146.858248265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.533158 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.533493 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.033479373 +0000 UTC m=+146.858680897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.633982 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.634112 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.134086828 +0000 UTC m=+146.959288352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.634214 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.634573 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.134565381 +0000 UTC m=+146.959766905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.735337 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.735549 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.235523987 +0000 UTC m=+147.060725511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.735607 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.735927 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.235919408 +0000 UTC m=+147.061120932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.836036 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.836244 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.336220874 +0000 UTC m=+147.161422398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.855430 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s5fp7"] Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.856479 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.858583 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.871173 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5fp7"] Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.910045 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:19 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Dec 09 11:34:19 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:19 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.910107 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.938178 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.938224 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.938276 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlr2m\" (UniqueName: \"kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.938336 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:19 crc kubenswrapper[4745]: E1209 11:34:19.938659 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.43864083 +0000 UTC m=+147.263842434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.989918 4745 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-72wx4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.989981 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" podUID="321f0b7a-1cf4-4814-81d3-fe25fd718555" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 11:34:19 crc kubenswrapper[4745]: I1209 11:34:19.993501 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" event={"ID":"b5078140-21f5-4159-96af-69a29c27ec36","Type":"ContainerStarted","Data":"6831aea37227480e35ef2d86bb2eedd7cc914968ec16a2621fc85799d70ec338"} Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.041105 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.041394 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.541365404 +0000 UTC m=+147.366566928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.041609 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlr2m\" (UniqueName: \"kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.041840 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.042039 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities\") pod \"community-operators-s5fp7\" (UID: 
\"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.042095 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.044455 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.045146 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.045434 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.545420177 +0000 UTC m=+147.370621691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.060523 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrmwh"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.061367 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.068184 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.076476 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlr2m\" (UniqueName: \"kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m\") pod \"community-operators-s5fp7\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.079697 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrmwh"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.145344 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 
crc kubenswrapper[4745]: E1209 11:34:20.145744 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.645727853 +0000 UTC m=+147.470929377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.172345 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.246858 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.247195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.247250 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.247290 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.247622 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.747609904 +0000 UTC m=+147.572811428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.279597 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zt7zb"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.281581 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt7zb"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.281722 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.351923 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.352003 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.851987614 +0000 UTC m=+147.677189128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352414 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcfj\" (UniqueName: 
\"kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352549 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352575 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352604 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352650 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352690 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352729 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.352755 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.356385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.356752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.357030 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:20.857016254 +0000 UTC m=+147.682217778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.383006 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.397722 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.400981 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.404231 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.408371 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6\") pod \"certified-operators-zrmwh\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.436769 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.441071 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2mvnf"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.442576 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.454122 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.454346 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.454394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.454416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcfj\" (UniqueName: \"kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.455204 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:20.955188022 +0000 UTC m=+147.780389546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.455602 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.455822 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.456359 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mvnf"] Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.477194 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.485852 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.510128 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.557445 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtvxs\" (UniqueName: \"kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.557579 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.557620 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.557692 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.557949 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:21.057937486 +0000 UTC m=+147.883139010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.570324 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcfj\" (UniqueName: \"kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj\") pod \"community-operators-zt7zb\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") " pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666203 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666385 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666555 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtvxs\" (UniqueName: \"kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666210 4745 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666613 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.666664 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 11:34:21.166638317 +0000 UTC m=+147.991839841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.666707 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.667007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.667039 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 11:34:21.167032368 +0000 UTC m=+147.992233892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lsfbd" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.667156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.688722 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtvxs\" (UniqueName: \"kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs\") pod \"certified-operators-2mvnf\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") " pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.768192 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:20 crc kubenswrapper[4745]: E1209 11:34:20.768477 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:34:21.268460606 +0000 UTC m=+148.093662130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.781828 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.795965 4745 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-09T11:34:20.666616696Z","Handler":null,"Name":""} Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.810000 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72wx4" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.817239 4745 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.817269 4745 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.876309 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.911168 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.911207 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.918864 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:20 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Dec 09 11:34:20 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:20 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:20 crc kubenswrapper[4745]: I1209 11:34:20.918929 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.053546 4745 generic.go:334] "Generic (PLEG): 
container finished" podID="ecd48ed5-6e8d-40f5-b3d1-1254df80a033" containerID="225ee8b0639543d5dff5ae33d4a25cd69f17c6e2179ae6e65a1ba3dc207d8c9a" exitCode=0 Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.053621 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" event={"ID":"ecd48ed5-6e8d-40f5-b3d1-1254df80a033","Type":"ContainerDied","Data":"225ee8b0639543d5dff5ae33d4a25cd69f17c6e2179ae6e65a1ba3dc207d8c9a"} Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.082658 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" event={"ID":"b5078140-21f5-4159-96af-69a29c27ec36","Type":"ContainerStarted","Data":"406096044b9408b16759d1892770218598ded07872feeda1815862a26eaf122b"} Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.127931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lsfbd\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.184586 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.202291 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5fp7"] Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.213274 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 11:34:21 crc kubenswrapper[4745]: W1209 11:34:21.252754 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81940df1_47e0_46fa_9215_5751defbf8c0.slice/crio-b42afdf27f5dfb60e05c782b8ac0939dcb7f0e7b854d9b00d8810db002d9d4dc WatchSource:0}: Error finding container b42afdf27f5dfb60e05c782b8ac0939dcb7f0e7b854d9b00d8810db002d9d4dc: Status 404 returned error can't find the container with id b42afdf27f5dfb60e05c782b8ac0939dcb7f0e7b854d9b00d8810db002d9d4dc Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.282275 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.309494 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.320161 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lwwlq" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.447113 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrmwh"] Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.593209 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 09 11:34:21 crc kubenswrapper[4745]: W1209 11:34:21.608592 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-396646a6bb5da780678a3891e79a74a618bcb70158e9ff7ae25a28a425093823 WatchSource:0}: Error finding container 396646a6bb5da780678a3891e79a74a618bcb70158e9ff7ae25a28a425093823: Status 404 returned error can't find the container with id 396646a6bb5da780678a3891e79a74a618bcb70158e9ff7ae25a28a425093823 Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.691400 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt7zb"] Dec 09 11:34:21 crc kubenswrapper[4745]: W1209 11:34:21.712327 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e36127f_1987_4067_bf22_38a1bc134721.slice/crio-526d902a9bdc618f0334d274226e0085bd68f5545106e478b23b06fd694bb71e WatchSource:0}: Error finding container 
526d902a9bdc618f0334d274226e0085bd68f5545106e478b23b06fd694bb71e: Status 404 returned error can't find the container with id 526d902a9bdc618f0334d274226e0085bd68f5545106e478b23b06fd694bb71e Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.787492 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.787593 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.787654 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.787703 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.798599 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.798653 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 
11:34:21.799815 4745 patch_prober.go:28] interesting pod/console-f9d7485db-mcrk5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.799854 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mcrk5" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.869779 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"] Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.904260 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:21 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Dec 09 11:34:21 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:21 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.904321 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:21 crc kubenswrapper[4745]: W1209 11:34:21.909305 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode774e15d_2c99_453e_9c78_4fde0bf037fc.slice/crio-7c87d26157cab0336c08a51c3e6cf3d8641961b6c17d93b07405c62a5c728810 WatchSource:0}: Error finding 
container 7c87d26157cab0336c08a51c3e6cf3d8641961b6c17d93b07405c62a5c728810: Status 404 returned error can't find the container with id 7c87d26157cab0336c08a51c3e6cf3d8641961b6c17d93b07405c62a5c728810 Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.956015 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mvnf"] Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.972201 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.973013 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:21 crc kubenswrapper[4745]: W1209 11:34:21.973640 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1305324_76f0_474c_8933_599d9b6eaff4.slice/crio-cde86d328f67b09dccf2b15257300577b452a9dced8449be9e8211db62682165 WatchSource:0}: Error finding container cde86d328f67b09dccf2b15257300577b452a9dced8449be9e8211db62682165: Status 404 returned error can't find the container with id cde86d328f67b09dccf2b15257300577b452a9dced8449be9e8211db62682165 Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.975203 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.975435 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 09 11:34:21 crc kubenswrapper[4745]: I1209 11:34:21.986377 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.000882 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.001045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.046736 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xnwjq"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.060204 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.062747 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.063409 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnwjq"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.096962 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" event={"ID":"e774e15d-2c99-453e-9c78-4fde0bf037fc","Type":"ContainerStarted","Data":"7c87d26157cab0336c08a51c3e6cf3d8641961b6c17d93b07405c62a5c728810"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.102122 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.102217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.102285 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 
11:34:22.102604 4745 generic.go:334] "Generic (PLEG): container finished" podID="7e36127f-1987-4067-bf22-38a1bc134721" containerID="850e0ce396bfb25e9139603a98ea9dee8197373b47d6e2a9bcf6a8eb7d72702b" exitCode=0 Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.102674 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerDied","Data":"850e0ce396bfb25e9139603a98ea9dee8197373b47d6e2a9bcf6a8eb7d72702b"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.102698 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerStarted","Data":"526d902a9bdc618f0334d274226e0085bd68f5545106e478b23b06fd694bb71e"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.103869 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerStarted","Data":"cde86d328f67b09dccf2b15257300577b452a9dced8449be9e8211db62682165"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.106713 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" event={"ID":"b5078140-21f5-4159-96af-69a29c27ec36","Type":"ContainerStarted","Data":"ddc4465d3879388ed7f32d36dcb8d3ce038419fb5849f31f16271ab5cfba6333"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.117430 4745 generic.go:334] "Generic (PLEG): container finished" podID="81940df1-47e0-46fa-9215-5751defbf8c0" containerID="7575e78a483a721aad8ea590d9ab4503977a81b22bd5a99226bb766c97acbeac" exitCode=0 Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.117493 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" 
event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerDied","Data":"7575e78a483a721aad8ea590d9ab4503977a81b22bd5a99226bb766c97acbeac"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.117558 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerStarted","Data":"b42afdf27f5dfb60e05c782b8ac0939dcb7f0e7b854d9b00d8810db002d9d4dc"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.122651 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.129995 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2680e3089971db3a34fe55c4ebae84b140fbd77db3efbb720c0ab2d2b3270760"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.130822 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.132653 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-r79qf" podStartSLOduration=13.132636999 podStartE2EDuration="13.132636999s" podCreationTimestamp="2025-12-09 11:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:22.130194642 +0000 UTC m=+148.955396166" watchObservedRunningTime="2025-12-09 11:34:22.132636999 +0000 UTC m=+148.957838523" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 
11:34:22.141904 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fd1cb2dbfb62c387a2aa6acfb56f838af509c16717a8d1c5435033d768f80a47"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.142179 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"65f83691c837798eec8b0ba3ede63c0138ba1f39dd8e666c352221bd65109284"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.144314 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0b81788546cf71f43f3c769cc2f188743a7e346510fe65815742a32c6e31b1a4"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.144431 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"396646a6bb5da780678a3891e79a74a618bcb70158e9ff7ae25a28a425093823"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.144718 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.148537 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerID="d88f7982963daf715e820057f69d7639aa7334698d19875c4552890502f15c57" exitCode=0 Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.148591 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" 
event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerDied","Data":"d88f7982963daf715e820057f69d7639aa7334698d19875c4552890502f15c57"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.149238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerStarted","Data":"28742e5cd1422fb841f565d0f56f476bc100a285b0d2584cb0ffa86173928ecb"} Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.204743 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmhg\" (UniqueName: \"kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.204844 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.204871 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.272480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.306335 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmhg\" (UniqueName: \"kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.306400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.306426 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.306801 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.307947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.313381 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.313455 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.313867 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.340272 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.367537 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmhg\" (UniqueName: \"kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg\") pod \"redhat-marketplace-xnwjq\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.475036 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.478166 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.479893 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.508560 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dtm\" (UniqueName: \"kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.508621 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.508671 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.552624 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.566962 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614498 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krf6l\" (UniqueName: \"kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l\") pod \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614599 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume\") pod \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614631 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume\") pod \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\" (UID: \"ecd48ed5-6e8d-40f5-b3d1-1254df80a033\") " Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614890 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dtm\" (UniqueName: \"kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614934 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.614989 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.615437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.627457 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ecd48ed5-6e8d-40f5-b3d1-1254df80a033" (UID: "ecd48ed5-6e8d-40f5-b3d1-1254df80a033"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.627584 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l" (OuterVolumeSpecName: "kube-api-access-krf6l") pod "ecd48ed5-6e8d-40f5-b3d1-1254df80a033" (UID: "ecd48ed5-6e8d-40f5-b3d1-1254df80a033"). InnerVolumeSpecName "kube-api-access-krf6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.627832 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume" (OuterVolumeSpecName: "config-volume") pod "ecd48ed5-6e8d-40f5-b3d1-1254df80a033" (UID: "ecd48ed5-6e8d-40f5-b3d1-1254df80a033"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.627929 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.654626 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dtm\" (UniqueName: \"kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm\") pod \"redhat-marketplace-hgrls\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.716032 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.716067 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.716079 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krf6l\" (UniqueName: \"kubernetes.io/projected/ecd48ed5-6e8d-40f5-b3d1-1254df80a033-kube-api-access-krf6l\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.801315 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.896526 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.906648 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.910063 4745 patch_prober.go:28] interesting pod/router-default-5444994796-mnbl6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 11:34:22 crc kubenswrapper[4745]: [+]has-synced ok Dec 09 11:34:22 crc kubenswrapper[4745]: [+]process-running ok Dec 09 11:34:22 crc kubenswrapper[4745]: healthz check failed Dec 09 11:34:22 crc kubenswrapper[4745]: I1209 11:34:22.910123 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mnbl6" podUID="852830c1-9fd1-4c23-807a-fef5c5934c82" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.055223 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cn2sj"] Dec 09 11:34:23 crc kubenswrapper[4745]: E1209 11:34:23.055448 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd48ed5-6e8d-40f5-b3d1-1254df80a033" containerName="collect-profiles" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.055461 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd48ed5-6e8d-40f5-b3d1-1254df80a033" containerName="collect-profiles" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.055587 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd48ed5-6e8d-40f5-b3d1-1254df80a033" 
containerName="collect-profiles" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.056273 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.066570 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.076123 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn2sj"] Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.222278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" event={"ID":"e774e15d-2c99-453e-9c78-4fde0bf037fc","Type":"ContainerStarted","Data":"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c"} Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.222713 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.232397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.232469 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.232533 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq8hn\" (UniqueName: \"kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.253968 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"19b49a763b7380abb5865660845e0e1b46761c27c01bb64b57588ebeecdda030"} Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.261230 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"37ed6c08-efb6-4db2-98ec-c881d5ea03ce","Type":"ContainerStarted","Data":"0451d3f1babec92c9890444a47d628d1116ecdbdab1b812cf1b7e26511f5a751"} Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.263588 4745 generic.go:334] "Generic (PLEG): container finished" podID="b1305324-76f0-474c-8933-599d9b6eaff4" containerID="1f0808138f4182b8b39ed2f8d1e3c38a9bed54a3509412aea7c52ae6e6da84b7" exitCode=0 Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.263647 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerDied","Data":"1f0808138f4182b8b39ed2f8d1e3c38a9bed54a3509412aea7c52ae6e6da84b7"} Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.274457 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" podStartSLOduration=130.274439375 podStartE2EDuration="2m10.274439375s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:23.250105819 +0000 UTC m=+150.075307343" watchObservedRunningTime="2025-12-09 11:34:23.274439375 +0000 UTC m=+150.099640899" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.288017 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" event={"ID":"ecd48ed5-6e8d-40f5-b3d1-1254df80a033","Type":"ContainerDied","Data":"701d312fd1f75eaa0d7b74f5ab33d7dfdb9a91261a5c16fbdddb393df912285c"} Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.288055 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701d312fd1f75eaa0d7b74f5ab33d7dfdb9a91261a5c16fbdddb393df912285c" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.288095 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.308022 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r6r9g" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.327118 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnwjq"] Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.334372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.334463 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.334554 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq8hn\" (UniqueName: \"kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.339745 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.340451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.405964 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq8hn\" (UniqueName: \"kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn\") pod \"redhat-operators-cn2sj\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") " pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.446054 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fc6n"] Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.450439 
4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.505397 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fc6n"] Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.644174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.644252 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svms\" (UniqueName: \"kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.644327 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.662554 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls"] Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.696004 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.747967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.748009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9svms\" (UniqueName: \"kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.748075 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.748618 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.757122 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " 
pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.787309 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svms\" (UniqueName: \"kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms\") pod \"redhat-operators-5fc6n\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") " pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.839477 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.905675 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:23 crc kubenswrapper[4745]: I1209 11:34:23.924375 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mnbl6" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.315976 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.318571 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.320099 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.325243 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.325571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.417494 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerID="6407bee20a614bc4fdbd3ebccd99d41430393994b48b0a4fa7abc9708b4e67da" exitCode=0 Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.417708 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerDied","Data":"6407bee20a614bc4fdbd3ebccd99d41430393994b48b0a4fa7abc9708b4e67da"} Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.417779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerStarted","Data":"af6d5eb0b447bc27f63bb597ade46cba090da2e9e521b0673c6df940a4b116de"} Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.426019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"37ed6c08-efb6-4db2-98ec-c881d5ea03ce","Type":"ContainerStarted","Data":"b5e13e15c1d6d2637fecfa7e3651086b110738a8285ff038657f6edc1a65e969"} Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.438350 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-cn2sj"] Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.449609 4745 generic.go:334] "Generic (PLEG): container finished" podID="a216af0b-0937-4319-97bc-9d180389b873" containerID="b1802f63dac16f2ce7c1e7cea0c0119508c516f81fd2506fd1079bbfa9fe5738" exitCode=0 Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.450629 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerDied","Data":"b1802f63dac16f2ce7c1e7cea0c0119508c516f81fd2506fd1079bbfa9fe5738"} Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.450665 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerStarted","Data":"17440cb627e36b234de495ceced756cfd393a4cab136392c0659c509379aa44c"} Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.467674 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.467775 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.487902 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.487877109 
podStartE2EDuration="3.487877109s" podCreationTimestamp="2025-12-09 11:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:24.46738625 +0000 UTC m=+151.292587774" watchObservedRunningTime="2025-12-09 11:34:24.487877109 +0000 UTC m=+151.313078633" Dec 09 11:34:24 crc kubenswrapper[4745]: W1209 11:34:24.527193 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaac685e_ee53_47ff_b3c4_7c999567e5cb.slice/crio-be2c85a59a5b6e26ccbeb03a02a9d6171db2fb57393b87b5f206c10910c839d1 WatchSource:0}: Error finding container be2c85a59a5b6e26ccbeb03a02a9d6171db2fb57393b87b5f206c10910c839d1: Status 404 returned error can't find the container with id be2c85a59a5b6e26ccbeb03a02a9d6171db2fb57393b87b5f206c10910c839d1 Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.569308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.569826 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.571866 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.599243 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.663498 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:24 crc kubenswrapper[4745]: I1209 11:34:24.666901 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fc6n"] Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.332219 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 11:34:25 crc kubenswrapper[4745]: W1209 11:34:25.424105 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1636108b_8084_45df_bd42_b0b61ff08892.slice/crio-1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa WatchSource:0}: Error finding container 1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa: Status 404 returned error can't find the container with id 1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.475596 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.475935 4745 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.479327 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerStarted","Data":"0c58a649554864a5320817d2f8db37d32ec6977ba7f9073021eda274163e66a8"} Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.489375 4745 generic.go:334] "Generic (PLEG): container finished" podID="37ed6c08-efb6-4db2-98ec-c881d5ea03ce" containerID="b5e13e15c1d6d2637fecfa7e3651086b110738a8285ff038657f6edc1a65e969" exitCode=0 Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.489449 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"37ed6c08-efb6-4db2-98ec-c881d5ea03ce","Type":"ContainerDied","Data":"b5e13e15c1d6d2637fecfa7e3651086b110738a8285ff038657f6edc1a65e969"} Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.495115 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1636108b-8084-45df-bd42-b0b61ff08892","Type":"ContainerStarted","Data":"1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa"} Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.506831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerStarted","Data":"9201bd8dbc866471141fa095357a5f7d41a07aafc608c8e0a84572e1a9673e9e"} Dec 09 11:34:25 crc kubenswrapper[4745]: I1209 11:34:25.506888 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" 
event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerStarted","Data":"be2c85a59a5b6e26ccbeb03a02a9d6171db2fb57393b87b5f206c10910c839d1"} Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.522037 4745 generic.go:334] "Generic (PLEG): container finished" podID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerID="9201bd8dbc866471141fa095357a5f7d41a07aafc608c8e0a84572e1a9673e9e" exitCode=0 Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.522331 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerDied","Data":"9201bd8dbc866471141fa095357a5f7d41a07aafc608c8e0a84572e1a9673e9e"} Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.539070 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerDied","Data":"3af8ed8234440e4b4b66c4b4f1aa037d6c28c663c220f11fd5a48e0af8137604"} Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.540169 4745 generic.go:334] "Generic (PLEG): container finished" podID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerID="3af8ed8234440e4b4b66c4b4f1aa037d6c28c663c220f11fd5a48e0af8137604" exitCode=0 Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.547199 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1636108b-8084-45df-bd42-b0b61ff08892","Type":"ContainerStarted","Data":"b038980df8ead3b00626908554932988d366ed98abc6857387f697fc96b5edf8"} Dec 09 11:34:26 crc kubenswrapper[4745]: I1209 11:34:26.585007 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.584992078 podStartE2EDuration="2.584992078s" podCreationTimestamp="2025-12-09 11:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:34:26.582290193 +0000 UTC m=+153.407491717" watchObservedRunningTime="2025-12-09 11:34:26.584992078 +0000 UTC m=+153.410193602" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.259782 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.368903 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir\") pod \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.368986 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access\") pod \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\" (UID: \"37ed6c08-efb6-4db2-98ec-c881d5ea03ce\") " Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.369494 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "37ed6c08-efb6-4db2-98ec-c881d5ea03ce" (UID: "37ed6c08-efb6-4db2-98ec-c881d5ea03ce"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.396689 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "37ed6c08-efb6-4db2-98ec-c881d5ea03ce" (UID: "37ed6c08-efb6-4db2-98ec-c881d5ea03ce"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.471001 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.471034 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37ed6c08-efb6-4db2-98ec-c881d5ea03ce-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.560200 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.568625 4745 generic.go:334] "Generic (PLEG): container finished" podID="1636108b-8084-45df-bd42-b0b61ff08892" containerID="b038980df8ead3b00626908554932988d366ed98abc6857387f697fc96b5edf8" exitCode=0 Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.585060 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"37ed6c08-efb6-4db2-98ec-c881d5ea03ce","Type":"ContainerDied","Data":"0451d3f1babec92c9890444a47d628d1116ecdbdab1b812cf1b7e26511f5a751"} Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.585105 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0451d3f1babec92c9890444a47d628d1116ecdbdab1b812cf1b7e26511f5a751" Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.585121 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1636108b-8084-45df-bd42-b0b61ff08892","Type":"ContainerDied","Data":"b038980df8ead3b00626908554932988d366ed98abc6857387f697fc96b5edf8"} Dec 09 11:34:27 crc kubenswrapper[4745]: I1209 11:34:27.940425 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jv8nd" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.222992 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.339134 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir\") pod \"1636108b-8084-45df-bd42-b0b61ff08892\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.339249 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access\") pod \"1636108b-8084-45df-bd42-b0b61ff08892\" (UID: \"1636108b-8084-45df-bd42-b0b61ff08892\") " Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.340538 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1636108b-8084-45df-bd42-b0b61ff08892" (UID: "1636108b-8084-45df-bd42-b0b61ff08892"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.362048 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1636108b-8084-45df-bd42-b0b61ff08892" (UID: "1636108b-8084-45df-bd42-b0b61ff08892"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.440701 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1636108b-8084-45df-bd42-b0b61ff08892-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.440753 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1636108b-8084-45df-bd42-b0b61ff08892-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.626482 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1636108b-8084-45df-bd42-b0b61ff08892","Type":"ContainerDied","Data":"1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa"} Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.626552 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bfb5428ed8f8fb10871a3c8098176743533b10a973155292f3d16fa2a7080aa" Dec 09 11:34:29 crc kubenswrapper[4745]: I1209 11:34:29.626590 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.787492 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.787869 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.787539 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.787986 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.798748 4745 patch_prober.go:28] interesting pod/console-f9d7485db-mcrk5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 09 11:34:31 crc kubenswrapper[4745]: I1209 11:34:31.798796 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mcrk5" 
podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 09 11:34:35 crc kubenswrapper[4745]: I1209 11:34:35.444210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:35 crc kubenswrapper[4745]: I1209 11:34:35.454290 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea6befdd-80ca-42c2-813f-62a5cdff9605-metrics-certs\") pod \"network-metrics-daemon-jdv4j\" (UID: \"ea6befdd-80ca-42c2-813f-62a5cdff9605\") " pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:35 crc kubenswrapper[4745]: I1209 11:34:35.497160 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdv4j" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.288228 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788209 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788232 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788262 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788278 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788301 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788816 4745 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="download-server" containerStatusID={"Type":"cri-o","ID":"17bb6bf6f4f53c1dae1dc0ed92331c352fde91492ad6c1ea2cbf88baddc39593"} pod="openshift-console/downloads-7954f5f757-8r64w" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788877 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788947 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.788888 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" containerID="cri-o://17bb6bf6f4f53c1dae1dc0ed92331c352fde91492ad6c1ea2cbf88baddc39593" gracePeriod=2 Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.801248 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:41 crc kubenswrapper[4745]: I1209 11:34:41.805054 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:34:43 crc kubenswrapper[4745]: I1209 11:34:43.792267 4745 generic.go:334] "Generic (PLEG): container finished" podID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerID="17bb6bf6f4f53c1dae1dc0ed92331c352fde91492ad6c1ea2cbf88baddc39593" exitCode=0 Dec 09 11:34:43 crc 
kubenswrapper[4745]: I1209 11:34:43.792360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8r64w" event={"ID":"d02382a6-7e0e-4274-bbe2-e713ac39756c","Type":"ContainerDied","Data":"17bb6bf6f4f53c1dae1dc0ed92331c352fde91492ad6c1ea2cbf88baddc39593"} Dec 09 11:34:51 crc kubenswrapper[4745]: I1209 11:34:51.787808 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:34:51 crc kubenswrapper[4745]: I1209 11:34:51.788206 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:34:52 crc kubenswrapper[4745]: I1209 11:34:52.835055 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n7w49" Dec 09 11:34:55 crc kubenswrapper[4745]: I1209 11:34:55.475375 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:34:55 crc kubenswrapper[4745]: I1209 11:34:55.475733 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:34:59 crc 
kubenswrapper[4745]: I1209 11:34:59.103715 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 11:34:59 crc kubenswrapper[4745]: E1209 11:34:59.104358 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636108b-8084-45df-bd42-b0b61ff08892" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.104371 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636108b-8084-45df-bd42-b0b61ff08892" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: E1209 11:34:59.104380 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed6c08-efb6-4db2-98ec-c881d5ea03ce" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.104386 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ed6c08-efb6-4db2-98ec-c881d5ea03ce" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.104471 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="1636108b-8084-45df-bd42-b0b61ff08892" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.104487 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ed6c08-efb6-4db2-98ec-c881d5ea03ce" containerName="pruner" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.104879 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.122424 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.122656 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.127257 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.231343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.231668 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.332941 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.333034 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.333134 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.358246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:34:59 crc kubenswrapper[4745]: I1209 11:34:59.442569 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:35:00 crc kubenswrapper[4745]: I1209 11:35:00.492759 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 11:35:00 crc kubenswrapper[4745]: I1209 11:35:00.893622 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jdv4j"] Dec 09 11:35:01 crc kubenswrapper[4745]: I1209 11:35:01.787604 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:01 crc kubenswrapper[4745]: I1209 11:35:01.787908 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:02 crc kubenswrapper[4745]: E1209 11:35:02.792111 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 09 11:35:02 crc kubenswrapper[4745]: E1209 11:35:02.792606 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtvxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2mvnf_openshift-marketplace(b1305324-76f0-474c-8933-599d9b6eaff4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:02 crc kubenswrapper[4745]: E1209 11:35:02.794038 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2mvnf" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" Dec 09 11:35:04 crc 
kubenswrapper[4745]: E1209 11:35:04.329872 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2mvnf" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" Dec 09 11:35:04 crc kubenswrapper[4745]: E1209 11:35:04.416551 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 09 11:35:04 crc kubenswrapper[4745]: E1209 11:35:04.416743 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4dtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hgrls_openshift-marketplace(d2d47c68-bfd3-4a99-afbf-7fe4f1478f05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:04 crc kubenswrapper[4745]: E1209 11:35:04.418227 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hgrls" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" Dec 09 11:35:04 crc 
kubenswrapper[4745]: I1209 11:35:04.733575 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.735072 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.749407 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.749484 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.749552 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.756641 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.850466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.850575 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.850621 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.850679 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.850633 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:04 crc kubenswrapper[4745]: I1209 11:35:04.870091 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access\") pod \"installer-9-crc\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:05 crc kubenswrapper[4745]: I1209 11:35:05.050691 4745 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:08 crc kubenswrapper[4745]: E1209 11:35:08.675023 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 09 11:35:08 crc kubenswrapper[4745]: E1209 11:35:08.675699 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xq8hn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Contain
erResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cn2sj_openshift-marketplace(daac685e-ee53-47ff-b3c4-7c999567e5cb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:08 crc kubenswrapper[4745]: E1209 11:35:08.676861 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cn2sj" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.226814 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hgrls" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.227111 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cn2sj" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.322499 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.322738 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlr2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-s5fp7_openshift-marketplace(81940df1-47e0-46fa-9215-5751defbf8c0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.323897 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-s5fp7" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.383196 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.383741 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwmhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xnwjq_openshift-marketplace(a216af0b-0937-4319-97bc-9d180389b873): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.384932 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xnwjq" podUID="a216af0b-0937-4319-97bc-9d180389b873" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.401955 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.402124 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd6g6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zrmwh_openshift-marketplace(e2ab340c-642b-4f8d-bc1f-adebf5e79418): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.403285 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zrmwh" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" Dec 09 11:35:10 crc 
kubenswrapper[4745]: I1209 11:35:10.461913 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.480147 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.480294 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhcfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnErr
or,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zt7zb_openshift-marketplace(7e36127f-1987-4067-bf22-38a1bc134721): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.482933 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zt7zb" podUID="7e36127f-1987-4067-bf22-38a1bc134721" Dec 09 11:35:10 crc kubenswrapper[4745]: W1209 11:35:10.495768 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd48d4949_4889_4b14_95bd_37954bd71417.slice/crio-c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382 WatchSource:0}: Error finding container c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382: Status 404 returned error can't find the container with id c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382 Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.521592 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 11:35:10 crc kubenswrapper[4745]: W1209 11:35:10.547133 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b661779_83ff_479f_9bfa_e2352640c734.slice/crio-0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513 WatchSource:0}: Error finding container 0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513: Status 404 returned error can't find the container with id 0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513 Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 
11:35:10.568247 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.568468 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9svms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5fc6n_openshift-marketplace(0bf72de4-c628-4b28-9bfc-1d6a874a6575): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.569716 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5fc6n" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.985467 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4b661779-83ff-479f-9bfa-e2352640c734","Type":"ContainerStarted","Data":"019c7390600fb0caa39432805141f39798d9f6e9ff25359249669e3f050da7f4"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.986014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4b661779-83ff-479f-9bfa-e2352640c734","Type":"ContainerStarted","Data":"0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.989064 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8r64w" event={"ID":"d02382a6-7e0e-4274-bbe2-e713ac39756c","Type":"ContainerStarted","Data":"0c98ce9e8e84b48f5b85af64576231947f1d1bdbe4c8e276f2dc73816302218a"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.990104 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.990219 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.990251 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.993138 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d48d4949-4889-4b14-95bd-37954bd71417","Type":"ContainerStarted","Data":"d75f4b0403a8f0ee2c7be1e1e4d80247e74b6fa931940face5028f370a753761"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.993185 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d48d4949-4889-4b14-95bd-37954bd71417","Type":"ContainerStarted","Data":"c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.996055 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" event={"ID":"ea6befdd-80ca-42c2-813f-62a5cdff9605","Type":"ContainerStarted","Data":"97ace247c949c0860150dfa7eebc09b149d5ca0429c1508f21e5dedb373ae84a"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.996116 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" event={"ID":"ea6befdd-80ca-42c2-813f-62a5cdff9605","Type":"ContainerStarted","Data":"0e6988758b37c9bbbe2bbe37b6eb147e96f5e54b0eea0ca67cd63e128f5e1df7"} Dec 09 11:35:10 crc kubenswrapper[4745]: I1209 11:35:10.996132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdv4j" 
event={"ID":"ea6befdd-80ca-42c2-813f-62a5cdff9605","Type":"ContainerStarted","Data":"aa36c640ea4754a4bc63acbd40585b8a42c970b5bb3821b86e350cac4ec5ffa9"} Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.998213 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xnwjq" podUID="a216af0b-0937-4319-97bc-9d180389b873" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.998213 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-s5fp7" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.999081 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5fc6n" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" Dec 09 11:35:10 crc kubenswrapper[4745]: E1209 11:35:10.999411 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zrmwh" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.024421 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=12.02439536 podStartE2EDuration="12.02439536s" 
podCreationTimestamp="2025-12-09 11:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:11.003703024 +0000 UTC m=+197.828904548" watchObservedRunningTime="2025-12-09 11:35:11.02439536 +0000 UTC m=+197.849596884" Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.037990 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jdv4j" podStartSLOduration=178.037965752 podStartE2EDuration="2m58.037965752s" podCreationTimestamp="2025-12-09 11:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:11.035880615 +0000 UTC m=+197.861082149" watchObservedRunningTime="2025-12-09 11:35:11.037965752 +0000 UTC m=+197.863167276" Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.039164 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.039153934 podStartE2EDuration="7.039153934s" podCreationTimestamp="2025-12-09 11:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:35:11.022984772 +0000 UTC m=+197.848186296" watchObservedRunningTime="2025-12-09 11:35:11.039153934 +0000 UTC m=+197.864355458" Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.787921 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.787969 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8r64w" 
podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.788025 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:11 crc kubenswrapper[4745]: I1209 11:35:11.788082 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:12 crc kubenswrapper[4745]: I1209 11:35:12.008524 4745 generic.go:334] "Generic (PLEG): container finished" podID="4b661779-83ff-479f-9bfa-e2352640c734" containerID="019c7390600fb0caa39432805141f39798d9f6e9ff25359249669e3f050da7f4" exitCode=0 Dec 09 11:35:12 crc kubenswrapper[4745]: I1209 11:35:12.010383 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:12 crc kubenswrapper[4745]: I1209 11:35:12.010430 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:12 crc kubenswrapper[4745]: I1209 11:35:12.010858 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4b661779-83ff-479f-9bfa-e2352640c734","Type":"ContainerDied","Data":"019c7390600fb0caa39432805141f39798d9f6e9ff25359249669e3f050da7f4"} Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.014647 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.014723 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.225081 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.377046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir\") pod \"4b661779-83ff-479f-9bfa-e2352640c734\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.377194 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4b661779-83ff-479f-9bfa-e2352640c734" (UID: "4b661779-83ff-479f-9bfa-e2352640c734"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.377303 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access\") pod \"4b661779-83ff-479f-9bfa-e2352640c734\" (UID: \"4b661779-83ff-479f-9bfa-e2352640c734\") " Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.377447 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b661779-83ff-479f-9bfa-e2352640c734-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.398216 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4b661779-83ff-479f-9bfa-e2352640c734" (UID: "4b661779-83ff-479f-9bfa-e2352640c734"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:13 crc kubenswrapper[4745]: I1209 11:35:13.478859 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b661779-83ff-479f-9bfa-e2352640c734-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:14 crc kubenswrapper[4745]: I1209 11:35:14.019718 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4b661779-83ff-479f-9bfa-e2352640c734","Type":"ContainerDied","Data":"0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513"} Dec 09 11:35:14 crc kubenswrapper[4745]: I1209 11:35:14.019758 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a893b3a5bc1fd586e431125eefafc8ebd03854e2f2c383e979b0d91034ba513" Dec 09 11:35:14 crc kubenswrapper[4745]: I1209 11:35:14.020854 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 11:35:21 crc kubenswrapper[4745]: I1209 11:35:21.787689 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:21 crc kubenswrapper[4745]: I1209 11:35:21.787734 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-8r64w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 09 11:35:21 crc kubenswrapper[4745]: I1209 11:35:21.788265 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:21 crc kubenswrapper[4745]: I1209 11:35:21.788401 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8r64w" podUID="d02382a6-7e0e-4274-bbe2-e713ac39756c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.32:8080/\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 09 11:35:23 crc kubenswrapper[4745]: I1209 11:35:23.074831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerStarted","Data":"769394f4d78b8f966c0a17068097b1be84b808ad404d2dd6db7e98fc46911e1c"} Dec 09 11:35:23 crc kubenswrapper[4745]: I1209 11:35:23.092317 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerStarted","Data":"89ceb3b77118bb1fbf2ea2a25637a57363263360760ea2ea52466c7e7081dbc8"} Dec 09 11:35:23 crc kubenswrapper[4745]: I1209 11:35:23.094940 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerStarted","Data":"e9210d7e735e5ec52351ab20d682fa553685f6786e68031040a8478a6cc1ccbf"} Dec 09 11:35:24 crc kubenswrapper[4745]: I1209 11:35:24.102441 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerID="89ceb3b77118bb1fbf2ea2a25637a57363263360760ea2ea52466c7e7081dbc8" exitCode=0 Dec 09 11:35:24 crc kubenswrapper[4745]: I1209 11:35:24.102621 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" 
event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerDied","Data":"89ceb3b77118bb1fbf2ea2a25637a57363263360760ea2ea52466c7e7081dbc8"} Dec 09 11:35:24 crc kubenswrapper[4745]: I1209 11:35:24.107766 4745 generic.go:334] "Generic (PLEG): container finished" podID="a216af0b-0937-4319-97bc-9d180389b873" containerID="e9210d7e735e5ec52351ab20d682fa553685f6786e68031040a8478a6cc1ccbf" exitCode=0 Dec 09 11:35:24 crc kubenswrapper[4745]: I1209 11:35:24.107804 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerDied","Data":"e9210d7e735e5ec52351ab20d682fa553685f6786e68031040a8478a6cc1ccbf"} Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.121418 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerStarted","Data":"30649c38f18402049dfb700591e74f1ce2fad89f52934e8243fd4cf1f6a300aa"} Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.124548 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerStarted","Data":"95257d43ff67112399a0721624d89636a6ff999474db1d908d5e8ac11b40122c"} Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.126356 4745 generic.go:334] "Generic (PLEG): container finished" podID="b1305324-76f0-474c-8933-599d9b6eaff4" containerID="769394f4d78b8f966c0a17068097b1be84b808ad404d2dd6db7e98fc46911e1c" exitCode=0 Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.126381 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerDied","Data":"769394f4d78b8f966c0a17068097b1be84b808ad404d2dd6db7e98fc46911e1c"} Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 
11:35:25.475588 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.475919 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.476065 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.476808 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:35:25 crc kubenswrapper[4745]: I1209 11:35:25.476960 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186" gracePeriod=600 Dec 09 11:35:26 crc kubenswrapper[4745]: I1209 11:35:26.133174 4745 generic.go:334] "Generic (PLEG): container finished" podID="81940df1-47e0-46fa-9215-5751defbf8c0" containerID="95257d43ff67112399a0721624d89636a6ff999474db1d908d5e8ac11b40122c" exitCode=0 Dec 09 
11:35:26 crc kubenswrapper[4745]: I1209 11:35:26.133268 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerDied","Data":"95257d43ff67112399a0721624d89636a6ff999474db1d908d5e8ac11b40122c"} Dec 09 11:35:27 crc kubenswrapper[4745]: I1209 11:35:27.140618 4745 generic.go:334] "Generic (PLEG): container finished" podID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerID="30649c38f18402049dfb700591e74f1ce2fad89f52934e8243fd4cf1f6a300aa" exitCode=0 Dec 09 11:35:27 crc kubenswrapper[4745]: I1209 11:35:27.140687 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerDied","Data":"30649c38f18402049dfb700591e74f1ce2fad89f52934e8243fd4cf1f6a300aa"} Dec 09 11:35:27 crc kubenswrapper[4745]: I1209 11:35:27.142968 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186" exitCode=0 Dec 09 11:35:27 crc kubenswrapper[4745]: I1209 11:35:27.142994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186"} Dec 09 11:35:31 crc kubenswrapper[4745]: I1209 11:35:31.811096 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8r64w" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.261091 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerStarted","Data":"4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806"} Dec 09 
11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.264105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.266764 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerID="46a809d4c66bc7ea55a1da13b1458e33caf7c789099e7fbb107ce2454c0c9ae7" exitCode=0 Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.266823 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerDied","Data":"46a809d4c66bc7ea55a1da13b1458e33caf7c789099e7fbb107ce2454c0c9ae7"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.270837 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerStarted","Data":"490a87bc46222b454c81fd43d17a2db78d79721460c43c3075792562ac79cba6"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.278139 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerStarted","Data":"6b5688e7242e25472304483f57e0c7a070e57ca07927bbc59f38746e7bbfbfce"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.282482 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerStarted","Data":"4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.287685 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerStarted","Data":"dc9484e866b5e4ab3c62825d5ffdd13102693c6e9c335a7f03303aa2ea181d16"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.290999 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerStarted","Data":"d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.295854 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerStarted","Data":"1d49bfa392e9c2e6cccecdc0beb0eb5490a9ab33614a8f375f652f5507ab1010"} Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.315476 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrmwh" podStartSLOduration=5.326959629 podStartE2EDuration="1m22.31545451s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="2025-12-09 11:34:22.150143336 +0000 UTC m=+148.975344860" lastFinishedPulling="2025-12-09 11:35:39.138638217 +0000 UTC m=+225.963839741" observedRunningTime="2025-12-09 11:35:42.313405824 +0000 UTC m=+229.138607368" watchObservedRunningTime="2025-12-09 11:35:42.31545451 +0000 UTC m=+229.140656034" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.315775 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2mvnf" podStartSLOduration=4.345340006 podStartE2EDuration="1m22.315768339s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="2025-12-09 11:34:23.266027671 +0000 UTC m=+150.091229195" lastFinishedPulling="2025-12-09 11:35:41.236456014 +0000 UTC m=+228.061657528" observedRunningTime="2025-12-09 
11:35:42.288100592 +0000 UTC m=+229.113302116" watchObservedRunningTime="2025-12-09 11:35:42.315768339 +0000 UTC m=+229.140969863" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.404132 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cn2sj" podStartSLOduration=4.60523857 podStartE2EDuration="1m19.404114405s" podCreationTimestamp="2025-12-09 11:34:23 +0000 UTC" firstStartedPulling="2025-12-09 11:34:26.523676334 +0000 UTC m=+153.348877858" lastFinishedPulling="2025-12-09 11:35:41.322552169 +0000 UTC m=+228.147753693" observedRunningTime="2025-12-09 11:35:42.402310385 +0000 UTC m=+229.227511909" watchObservedRunningTime="2025-12-09 11:35:42.404114405 +0000 UTC m=+229.229315929" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.422032 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xnwjq" podStartSLOduration=3.624701042 podStartE2EDuration="1m20.422015594s" podCreationTimestamp="2025-12-09 11:34:22 +0000 UTC" firstStartedPulling="2025-12-09 11:34:24.45443471 +0000 UTC m=+151.279636234" lastFinishedPulling="2025-12-09 11:35:41.251749272 +0000 UTC m=+228.076950786" observedRunningTime="2025-12-09 11:35:42.420739039 +0000 UTC m=+229.245940563" watchObservedRunningTime="2025-12-09 11:35:42.422015594 +0000 UTC m=+229.247217118" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.473777 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s5fp7" podStartSLOduration=4.31985513 podStartE2EDuration="1m23.473750769s" podCreationTimestamp="2025-12-09 11:34:19 +0000 UTC" firstStartedPulling="2025-12-09 11:34:22.122345503 +0000 UTC m=+148.947547027" lastFinishedPulling="2025-12-09 11:35:41.276241142 +0000 UTC m=+228.101442666" observedRunningTime="2025-12-09 11:35:42.459476589 +0000 UTC m=+229.284678123" watchObservedRunningTime="2025-12-09 11:35:42.473750769 
+0000 UTC m=+229.298952293" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.568186 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:35:42 crc kubenswrapper[4745]: I1209 11:35:42.568496 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.303535 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerStarted","Data":"335f829451ea9ad72ccf7dfb933d6e123fa34c49a709a1f97c338da78d7499e3"} Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.306484 4745 generic.go:334] "Generic (PLEG): container finished" podID="7e36127f-1987-4067-bf22-38a1bc134721" containerID="1d49bfa392e9c2e6cccecdc0beb0eb5490a9ab33614a8f375f652f5507ab1010" exitCode=0 Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.306955 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerDied","Data":"1d49bfa392e9c2e6cccecdc0beb0eb5490a9ab33614a8f375f652f5507ab1010"} Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.328884 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgrls" podStartSLOduration=3.074972592 podStartE2EDuration="1m21.328864651s" podCreationTimestamp="2025-12-09 11:34:22 +0000 UTC" firstStartedPulling="2025-12-09 11:34:24.424656203 +0000 UTC m=+151.249857727" lastFinishedPulling="2025-12-09 11:35:42.678548262 +0000 UTC m=+229.503749786" observedRunningTime="2025-12-09 11:35:43.328565583 +0000 UTC m=+230.153767107" watchObservedRunningTime="2025-12-09 11:35:43.328864651 +0000 UTC m=+230.154066175" Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 
11:35:43.699548 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.700139 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:35:43 crc kubenswrapper[4745]: I1209 11:35:43.721561 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xnwjq" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="registry-server" probeResult="failure" output=< Dec 09 11:35:43 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Dec 09 11:35:43 crc kubenswrapper[4745]: > Dec 09 11:35:44 crc kubenswrapper[4745]: I1209 11:35:44.313901 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerStarted","Data":"e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045"} Dec 09 11:35:44 crc kubenswrapper[4745]: I1209 11:35:44.315810 4745 generic.go:334] "Generic (PLEG): container finished" podID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerID="490a87bc46222b454c81fd43d17a2db78d79721460c43c3075792562ac79cba6" exitCode=0 Dec 09 11:35:44 crc kubenswrapper[4745]: I1209 11:35:44.315877 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerDied","Data":"490a87bc46222b454c81fd43d17a2db78d79721460c43c3075792562ac79cba6"} Dec 09 11:35:44 crc kubenswrapper[4745]: I1209 11:35:44.335167 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zt7zb" podStartSLOduration=3.911085066 podStartE2EDuration="1m24.335145398s" podCreationTimestamp="2025-12-09 11:34:20 +0000 UTC" firstStartedPulling="2025-12-09 
11:34:23.288005082 +0000 UTC m=+150.113206606" lastFinishedPulling="2025-12-09 11:35:43.712065414 +0000 UTC m=+230.537266938" observedRunningTime="2025-12-09 11:35:44.329985947 +0000 UTC m=+231.155187491" watchObservedRunningTime="2025-12-09 11:35:44.335145398 +0000 UTC m=+231.160346922" Dec 09 11:35:44 crc kubenswrapper[4745]: I1209 11:35:44.759216 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cn2sj" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="registry-server" probeResult="failure" output=< Dec 09 11:35:44 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Dec 09 11:35:44 crc kubenswrapper[4745]: > Dec 09 11:35:45 crc kubenswrapper[4745]: I1209 11:35:45.326565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerStarted","Data":"dd7be5e860d00f7d877001d8a14ccc62b5c4eb22d9ddee584d6b6ce770bf661b"} Dec 09 11:35:45 crc kubenswrapper[4745]: I1209 11:35:45.346906 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fc6n" podStartSLOduration=3.967591083 podStartE2EDuration="1m22.346885274s" podCreationTimestamp="2025-12-09 11:34:23 +0000 UTC" firstStartedPulling="2025-12-09 11:34:26.539243146 +0000 UTC m=+153.364444670" lastFinishedPulling="2025-12-09 11:35:44.918537337 +0000 UTC m=+231.743738861" observedRunningTime="2025-12-09 11:35:45.346261517 +0000 UTC m=+232.171463041" watchObservedRunningTime="2025-12-09 11:35:45.346885274 +0000 UTC m=+232.172086798" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.658102 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.659432 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b661779-83ff-479f-9bfa-e2352640c734" containerName="pruner" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.659454 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b661779-83ff-479f-9bfa-e2352640c734" containerName="pruner" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.659598 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b661779-83ff-479f-9bfa-e2352640c734" containerName="pruner" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.660052 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.660315 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.660460 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6" gracePeriod=15 Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.660472 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d" gracePeriod=15 Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.660789 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128" gracePeriod=15 Dec 09 11:35:48 crc 
kubenswrapper[4745]: I1209 11:35:48.660865 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc" gracePeriod=15 Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661151 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0" gracePeriod=15 Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661349 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661604 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661632 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661648 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661660 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661677 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661686 4745 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661706 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661714 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661726 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661735 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661746 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661756 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 11:35:48 crc kubenswrapper[4745]: E1209 11:35:48.661770 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661778 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661911 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc 
kubenswrapper[4745]: I1209 11:35:48.661927 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661946 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661960 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661970 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.661982 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.712699 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807491 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807567 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807586 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807628 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807654 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807801 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807827 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.807872 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909437 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909631 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909661 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909689 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909806 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909830 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909879 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909908 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910015 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910045 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910083 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910118 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910069 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.910156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 
11:35:48.910161 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:48 crc kubenswrapper[4745]: I1209 11:35:48.909582 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.007109 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:35:49 crc kubenswrapper[4745]: W1209 11:35:49.034881 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-da28065d8bfefad68d17a4907eb26ecd045e593422c2bed92601b0d9bd7c3afd WatchSource:0}: Error finding container da28065d8bfefad68d17a4907eb26ecd045e593422c2bed92601b0d9bd7c3afd: Status 404 returned error can't find the container with id da28065d8bfefad68d17a4907eb26ecd045e593422c2bed92601b0d9bd7c3afd Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.224873 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.224931 4745 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.349587 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"da28065d8bfefad68d17a4907eb26ecd045e593422c2bed92601b0d9bd7c3afd"} Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.409765 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]log ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]api-openshift-apiserver-available ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]api-openshift-oauth-apiserver-available ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]informer-sync ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 09 11:35:49 crc 
kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-filter ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-apiextensions-informers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-apiextensions-controllers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/crd-informer-synced ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-system-namespaces-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/rbac/bootstrap-roles ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/bootstrap-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/start-kube-aggregator-informers ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-registration-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: 
[+]poststarthook/apiservice-wait-for-first-sync ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-discovery-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]autoregister-completion ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-openapi-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 09 11:35:49 crc kubenswrapper[4745]: [-]shutdown failed: reason withheld Dec 09 11:35:49 crc kubenswrapper[4745]: readyz check failed Dec 09 11:35:49 crc kubenswrapper[4745]: I1209 11:35:49.409823 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.173747 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.174612 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.437730 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.438004 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.477323 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.479998 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.666532 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.666584 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.718006 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.783721 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.783767 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:35:50 crc kubenswrapper[4745]: I1209 11:35:50.826305 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.373662 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.377242 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.378284 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6" exitCode=0 Dec 09 11:35:51 crc kubenswrapper[4745]: 
I1209 11:35:51.378317 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128" exitCode=0 Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.378327 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc" exitCode=2 Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.418173 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.425595 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zt7zb" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.426070 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:35:51 crc kubenswrapper[4745]: I1209 11:35:51.429597 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.482770 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.45:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f88f1e611ee53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container 
startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 11:35:51.480954451 +0000 UTC m=+238.306155975,LastTimestamp:2025-12-09 11:35:51.480954451 +0000 UTC m=+238.306155975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.581479 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:35:51Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:35:51Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:35:51Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:35:51Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:63101a04e548d90b44dd9a102f0e3e446b6168bfeab9cb54b681265982f4da50\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f4f7e8636484dd6cf3173aac295f2fa1fdb8abce8310f4017acf6b575727e32c\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1623994699},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:90a4e5cb660cb2430628f4f9d6f8772fc21f360abc05255a81f4a0262b1b54f4\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:aae58f58c8613b1c3b98355a534fbc804422c982b416204cf9aad7e9a92ac27d\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204478801},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:a2149679a3b0827822ac22129efa300ab25d90a557f8961dbcfdda3f55222fca\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:f52ab4781f6f00120bb4d7222688ce6f64d4ae75218b6cc36cbf9a14ad4d4060\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201829714},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\
\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e27
6c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"na
mes\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.582157 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.582643 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.583031 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.583623 4745 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:51 crc kubenswrapper[4745]: E1209 11:35:51.583643 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:35:52 crc kubenswrapper[4745]: E1209 11:35:52.333018 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.45:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f88f1e611ee53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 11:35:51.480954451 +0000 UTC m=+238.306155975,LastTimestamp:2025-12-09 11:35:51.480954451 +0000 UTC m=+238.306155975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.386799 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.388044 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.388689 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d" exitCode=0 Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.388789 4745 scope.go:117] "RemoveContainer" containerID="580e328dcda89fa13957ab69d0e67bfbb5739bcfaa0d8f99c19df7e9c19d59db" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.390767 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a"} Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.394118 4745 generic.go:334] "Generic (PLEG): container finished" podID="d48d4949-4889-4b14-95bd-37954bd71417" containerID="d75f4b0403a8f0ee2c7be1e1e4d80247e74b6fa931940face5028f370a753761" exitCode=0 Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.394262 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d48d4949-4889-4b14-95bd-37954bd71417","Type":"ContainerDied","Data":"d75f4b0403a8f0ee2c7be1e1e4d80247e74b6fa931940face5028f370a753761"} Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.639674 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.682367 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.803259 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:35:52 crc kubenswrapper[4745]: I1209 11:35:52.803306 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:35:52 crc 
kubenswrapper[4745]: I1209 11:35:52.842763 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.451313 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.717769 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.746106 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.790762 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.840703 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.840751 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.881360 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access\") pod \"d48d4949-4889-4b14-95bd-37954bd71417\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.881920 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir\") pod \"d48d4949-4889-4b14-95bd-37954bd71417\" (UID: 
\"d48d4949-4889-4b14-95bd-37954bd71417\") " Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.882188 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock\") pod \"d48d4949-4889-4b14-95bd-37954bd71417\" (UID: \"d48d4949-4889-4b14-95bd-37954bd71417\") " Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.882046 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d48d4949-4889-4b14-95bd-37954bd71417" (UID: "d48d4949-4889-4b14-95bd-37954bd71417"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.882289 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock" (OuterVolumeSpecName: "var-lock") pod "d48d4949-4889-4b14-95bd-37954bd71417" (UID: "d48d4949-4889-4b14-95bd-37954bd71417"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.883065 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.883718 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.883817 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d48d4949-4889-4b14-95bd-37954bd71417-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.899838 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d48d4949-4889-4b14-95bd-37954bd71417" (UID: "d48d4949-4889-4b14-95bd-37954bd71417"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:35:53 crc kubenswrapper[4745]: I1209 11:35:53.984552 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d48d4949-4889-4b14-95bd-37954bd71417-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.410932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d48d4949-4889-4b14-95bd-37954bd71417","Type":"ContainerDied","Data":"c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382"} Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.410994 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c143a392e02c6827a8e335955ef44a742dd40222d18e5ac4c3688154edc0d382" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.410995 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.415596 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.416529 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0" exitCode=0 Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.460232 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fc6n" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.726683 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 11:35:54 crc 
kubenswrapper[4745]: I1209 11:35:54.727526 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.896403 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.896582 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897250 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897327 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897695 4745 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897723 4745 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:54 crc kubenswrapper[4745]: I1209 11:35:54.897737 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.431260 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.432845 4745 scope.go:117] "RemoveContainer" containerID="c8fff837a5cebd826d69b5bb22a3f86c05e47da3335fb5b6deaf531bd9271a2d" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.433019 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.455239 4745 scope.go:117] "RemoveContainer" containerID="2f3ce15012ca2753ab34af80a0f3808a3b42d053fecd9ffa7e7c9ccdb238c6e6" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.477424 4745 scope.go:117] "RemoveContainer" containerID="7bb61dbd5e52651f4152e6d4181191805eb7ae73aac6433fabccddae4e138128" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.480053 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.480684 4745 status_manager.go:851] "Failed to get status for pod" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" pod="openshift-marketplace/community-operators-s5fp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s5fp7\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.486965 4745 status_manager.go:851] "Failed to get status for pod" podUID="7e36127f-1987-4067-bf22-38a1bc134721" pod="openshift-marketplace/community-operators-zt7zb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zt7zb\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.487794 4745 status_manager.go:851] "Failed to get status for pod" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" pod="openshift-marketplace/certified-operators-zrmwh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zrmwh\": dial tcp 
38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.488194 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.488626 4745 status_manager.go:851] "Failed to get status for pod" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" pod="openshift-marketplace/redhat-operators-5fc6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5fc6n\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.489060 4745 status_manager.go:851] "Failed to get status for pod" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" pod="openshift-marketplace/redhat-marketplace-hgrls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hgrls\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.489573 4745 status_manager.go:851] "Failed to get status for pod" podUID="d48d4949-4889-4b14-95bd-37954bd71417" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.489901 4745 status_manager.go:851] "Failed to get status for pod" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" pod="openshift-marketplace/redhat-operators-cn2sj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn2sj\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 
11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.490289 4745 status_manager.go:851] "Failed to get status for pod" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" pod="openshift-marketplace/certified-operators-2mvnf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2mvnf\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.490780 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.491121 4745 status_manager.go:851] "Failed to get status for pod" podUID="a216af0b-0937-4319-97bc-9d180389b873" pod="openshift-marketplace/redhat-marketplace-xnwjq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xnwjq\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.491592 4745 status_manager.go:851] "Failed to get status for pod" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" pod="openshift-marketplace/community-operators-s5fp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s5fp7\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.492335 4745 status_manager.go:851] "Failed to get status for pod" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" pod="openshift-marketplace/community-operators-s5fp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s5fp7\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 
09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.493461 4745 status_manager.go:851] "Failed to get status for pod" podUID="7e36127f-1987-4067-bf22-38a1bc134721" pod="openshift-marketplace/community-operators-zt7zb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zt7zb\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.493913 4745 status_manager.go:851] "Failed to get status for pod" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" pod="openshift-marketplace/certified-operators-zrmwh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zrmwh\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.494274 4745 status_manager.go:851] "Failed to get status for pod" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" pod="openshift-marketplace/redhat-operators-5fc6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5fc6n\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.494619 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.495139 4745 status_manager.go:851] "Failed to get status for pod" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" pod="openshift-marketplace/redhat-marketplace-hgrls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hgrls\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc 
kubenswrapper[4745]: I1209 11:35:55.495426 4745 status_manager.go:851] "Failed to get status for pod" podUID="d48d4949-4889-4b14-95bd-37954bd71417" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.495716 4745 status_manager.go:851] "Failed to get status for pod" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" pod="openshift-marketplace/redhat-operators-cn2sj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn2sj\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.496136 4745 status_manager.go:851] "Failed to get status for pod" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" pod="openshift-marketplace/certified-operators-2mvnf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2mvnf\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.496420 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.496740 4745 status_manager.go:851] "Failed to get status for pod" podUID="a216af0b-0937-4319-97bc-9d180389b873" pod="openshift-marketplace/redhat-marketplace-xnwjq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xnwjq\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:55 crc 
kubenswrapper[4745]: I1209 11:35:55.509343 4745 scope.go:117] "RemoveContainer" containerID="c66d3437c4c16e920a7f3a1997aaa94f980c1ebc5cb036506824d6dda32858dc" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.529563 4745 scope.go:117] "RemoveContainer" containerID="df00c9ad3e08ffda11c5af257c877797854cb541a86ad689e630871951677fb0" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.550477 4745 scope.go:117] "RemoveContainer" containerID="046dd28eb789f92d6b0086f8ff8ac7ec4390b4402d55a26634f059b580d6f4b3" Dec 09 11:35:55 crc kubenswrapper[4745]: I1209 11:35:55.565628 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.649898 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.651168 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.651579 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.651986 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.652292 
4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:57 crc kubenswrapper[4745]: I1209 11:35:57.652329 4745 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.652663 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="200ms" Dec 09 11:35:57 crc kubenswrapper[4745]: E1209 11:35:57.854639 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="400ms" Dec 09 11:35:58 crc kubenswrapper[4745]: E1209 11:35:58.256459 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="800ms" Dec 09 11:35:59 crc kubenswrapper[4745]: E1209 11:35:59.057785 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="1.6s" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.554954 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.556037 4745 status_manager.go:851] "Failed to get status for pod" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" pod="openshift-marketplace/community-operators-s5fp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s5fp7\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.556425 4745 status_manager.go:851] "Failed to get status for pod" podUID="7e36127f-1987-4067-bf22-38a1bc134721" pod="openshift-marketplace/community-operators-zt7zb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zt7zb\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.556846 4745 status_manager.go:851] "Failed to get status for pod" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" pod="openshift-marketplace/certified-operators-zrmwh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zrmwh\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.557211 4745 status_manager.go:851] "Failed to get status for pod" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" pod="openshift-marketplace/redhat-operators-5fc6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5fc6n\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.557445 4745 status_manager.go:851] "Failed to get status for pod" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" pod="openshift-marketplace/redhat-marketplace-hgrls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hgrls\": dial 
tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.557697 4745 status_manager.go:851] "Failed to get status for pod" podUID="d48d4949-4889-4b14-95bd-37954bd71417" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.557969 4745 status_manager.go:851] "Failed to get status for pod" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" pod="openshift-marketplace/redhat-operators-cn2sj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn2sj\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.558155 4745 status_manager.go:851] "Failed to get status for pod" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" pod="openshift-marketplace/certified-operators-2mvnf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2mvnf\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.558442 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.558707 4745 status_manager.go:851] "Failed to get status for pod" podUID="a216af0b-0937-4319-97bc-9d180389b873" pod="openshift-marketplace/redhat-marketplace-xnwjq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xnwjq\": dial tcp 
38.102.83.45:6443: connect: connection refused" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.569476 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.569522 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:35:59 crc kubenswrapper[4745]: E1209 11:35:59.570025 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:35:59 crc kubenswrapper[4745]: I1209 11:35:59.570833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:00 crc kubenswrapper[4745]: I1209 11:36:00.464106 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6436b6d51a333ef2fa93105a548c9687876d3cf0105426e418ffa317bcb2abc1"} Dec 09 11:36:00 crc kubenswrapper[4745]: E1209 11:36:00.659367 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="3.2s" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.472040 4745 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="76817fa43f87cba5afe7893e389339f14a41140359699a5b9823759b69188f2d" exitCode=0 Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.472442 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"76817fa43f87cba5afe7893e389339f14a41140359699a5b9823759b69188f2d"} Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.473006 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.473046 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.473442 4745 status_manager.go:851] "Failed to get status for pod" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" pod="openshift-marketplace/redhat-marketplace-hgrls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hgrls\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.473583 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.473822 4745 status_manager.go:851] "Failed to get status for pod" podUID="d48d4949-4889-4b14-95bd-37954bd71417" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.474166 4745 status_manager.go:851] "Failed to get status for pod" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" pod="openshift-marketplace/redhat-operators-cn2sj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn2sj\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.474459 4745 status_manager.go:851] "Failed to get status for pod" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" pod="openshift-marketplace/certified-operators-2mvnf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2mvnf\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.475131 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.475469 4745 status_manager.go:851] "Failed to get status for pod" podUID="a216af0b-0937-4319-97bc-9d180389b873" pod="openshift-marketplace/redhat-marketplace-xnwjq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xnwjq\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.476031 4745 status_manager.go:851] "Failed to get status for pod" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" pod="openshift-marketplace/community-operators-s5fp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s5fp7\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.476379 4745 status_manager.go:851] "Failed to get status for pod" podUID="7e36127f-1987-4067-bf22-38a1bc134721" pod="openshift-marketplace/community-operators-zt7zb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zt7zb\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.476770 4745 status_manager.go:851] "Failed to get status for pod" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" pod="openshift-marketplace/certified-operators-zrmwh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zrmwh\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: I1209 11:36:01.477045 4745 status_manager.go:851] "Failed to get status for pod" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" pod="openshift-marketplace/redhat-operators-5fc6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5fc6n\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.708571 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:36:01Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:36:01Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:36:01Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T11:36:01Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:63101a04e548d90b44dd9a102f0e3e446b6168bfeab9cb54b681265982f4da50\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f4f7e8636484dd6cf3173aac295f2fa1fdb8abce8310f4017acf6b575727e32c\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1623994699},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:90a4e5cb660cb2430628f4f9d6f8772fc21f360abc05255a81f4a0262b1b54f4\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:aae58f58c8613b1c3b98355a534fbc804422c982b416204cf9aad7e9a92ac27d\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"
sizeBytes\\\":1204478801},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:a2149679a3b0827822ac22129efa300ab25d90a557f8961dbcfdda3f55222fca\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:f52ab4781f6f00120bb4d7222688ce6f64d4ae75218b6cc36cbf9a14ad4d4060\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201829714},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.709474 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.709802 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.709993 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.710187 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" Dec 09 11:36:01 crc kubenswrapper[4745]: E1209 11:36:01.710208 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 11:36:02 crc kubenswrapper[4745]: I1209 11:36:02.488979 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a7fbbacc30c96299b44052d9b0ef2aff255ec95cf934b71efc0c5147b0694321"} Dec 09 11:36:02 crc kubenswrapper[4745]: I1209 11:36:02.489025 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2a6ae87ed233615197eaf9cc5021d1d986bbeb21063c7d30469228ea6c78a002"} Dec 09 11:36:02 crc kubenswrapper[4745]: I1209 11:36:02.489040 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"707f3249c62e0217e4366620526b52f685da6a7cf7c0b3484e1cd36a756d8045"} Dec 09 11:36:03 crc kubenswrapper[4745]: I1209 11:36:03.501819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e7ef063a3dd84b166df8dd6803813e3c50ebac968f6270e5cd4d2f74211baa09"} Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.512842 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f2b7474be253455232ebeb1fd8feb735f2ea96f34a52eee289240750e0a2f45"} Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.513119 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.513337 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.513390 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.571117 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.571241 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.576856 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]log ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]etcd ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-filter ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-apiextensions-informers ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-apiextensions-controllers ok Dec 09 11:36:04 crc 
kubenswrapper[4745]: [+]poststarthook/crd-informer-synced ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-system-namespaces-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 09 11:36:04 crc kubenswrapper[4745]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/bootstrap-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/start-kube-aggregator-informers ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-registration-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-discovery-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]autoregister-completion ok Dec 09 11:36:04 crc kubenswrapper[4745]: 
[+]poststarthook/apiservice-openapi-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 09 11:36:04 crc kubenswrapper[4745]: livez check failed Dec 09 11:36:04 crc kubenswrapper[4745]: I1209 11:36:04.578543 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.326038 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.327274 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.526067 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.526129 4745 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365" exitCode=1 Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.526173 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365"} Dec 09 11:36:05 crc kubenswrapper[4745]: I1209 11:36:05.526962 4745 scope.go:117] "RemoveContainer" containerID="5c8d7378cae63aa9570756b9e0424148d347ecc98ab4648549864a8121607365" Dec 09 11:36:07 crc kubenswrapper[4745]: I1209 11:36:07.544353 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 11:36:07 crc kubenswrapper[4745]: I1209 11:36:07.544762 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f61fe639f3cf03aab3ba170199eca34ab5bdf16d4dd8f6e962f492043128525f"} Dec 09 11:36:09 crc kubenswrapper[4745]: I1209 11:36:09.530107 4745 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:09 crc kubenswrapper[4745]: I1209 11:36:09.577863 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.111850 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.117428 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.121732 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7b893ecf-dae9-4710-beba-39fd61022eaa" Dec 09 11:36:10 
crc kubenswrapper[4745]: I1209 11:36:10.559178 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.559348 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.559425 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:10 crc kubenswrapper[4745]: I1209 11:36:10.568456 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:11 crc kubenswrapper[4745]: I1209 11:36:11.563663 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:11 crc kubenswrapper[4745]: I1209 11:36:11.563700 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fa0e95e9-72b9-4928-bb37-a6761c77500c" Dec 09 11:36:13 crc kubenswrapper[4745]: I1209 11:36:13.570913 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7b893ecf-dae9-4710-beba-39fd61022eaa" Dec 09 11:36:14 crc kubenswrapper[4745]: I1209 11:36:14.857161 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 11:36:14 crc kubenswrapper[4745]: I1209 11:36:14.892603 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.049642 4745 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.120858 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.438410 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.606177 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.742592 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.875108 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.898916 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 11:36:15 crc kubenswrapper[4745]: I1209 11:36:15.926230 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.230449 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.274182 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.353887 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.431072 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.573430 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.634426 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.792751 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.819639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.838479 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 11:36:16 crc kubenswrapper[4745]: I1209 11:36:16.904262 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.142616 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.186598 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.217312 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.527993 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 11:36:17 crc 
kubenswrapper[4745]: I1209 11:36:17.636180 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.725107 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.776658 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.783961 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.787932 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.790669 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.797485 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 11:36:17 crc kubenswrapper[4745]: I1209 11:36:17.988019 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.076811 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.249094 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.311961 4745 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.379108 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.454089 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.752965 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.820545 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.846202 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.887538 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 11:36:18 crc kubenswrapper[4745]: I1209 11:36:18.920747 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.108743 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.260605 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.492273 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.577707 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.594751 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.634307 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.851303 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.940369 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 11:36:19 crc kubenswrapper[4745]: I1209 11:36:19.964495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.071334 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.331040 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.340564 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.371196 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 
11:36:20.455219 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.538633 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.676181 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.741814 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.753648 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 11:36:20 crc kubenswrapper[4745]: I1209 11:36:20.918647 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 11:36:21 crc kubenswrapper[4745]: I1209 11:36:21.047936 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 11:36:21 crc kubenswrapper[4745]: I1209 11:36:21.072702 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 11:36:21 crc kubenswrapper[4745]: I1209 11:36:21.429097 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 11:36:21 crc kubenswrapper[4745]: I1209 11:36:21.641125 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 11:36:21 crc kubenswrapper[4745]: I1209 11:36:21.775958 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.046407 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.060002 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.180995 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.451070 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.586801 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.621096 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.820972 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.845602 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.846364 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.850469 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.888204 4745 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.947601 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 11:36:22 crc kubenswrapper[4745]: I1209 11:36:22.996858 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.135704 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.364369 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.576416 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.847433 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.943794 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 11:36:23 crc kubenswrapper[4745]: I1209 11:36:23.963770 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.047922 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.259962 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.387075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.589256 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.601947 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.629824 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.908620 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 11:36:24 crc kubenswrapper[4745]: I1209 11:36:24.976068 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.087858 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.497120 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.522160 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.557226 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.594404 4745 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.686180 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.690810 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.736032 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.783796 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.912272 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 11:36:25 crc kubenswrapper[4745]: I1209 11:36:25.990106 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.207095 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.258202 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.372187 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.551655 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.745836 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.873811 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.906502 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 11:36:26 crc kubenswrapper[4745]: I1209 11:36:26.978172 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.118261 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.152127 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.204661 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.264946 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.316861 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.631166 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 
09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.678315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.726609 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.734311 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 11:36:27 crc kubenswrapper[4745]: I1209 11:36:27.956679 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.108707 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.137302 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.217934 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.353145 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.381568 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.417828 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.480230 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 11:36:28 crc 
kubenswrapper[4745]: I1209 11:36:28.504727 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.526965 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.615711 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.680494 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.723890 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.761867 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.783139 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.786221 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.854921 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.971726 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 11:36:28 crc kubenswrapper[4745]: I1209 11:36:28.981070 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 
11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.006123 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.113646 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.119579 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.158877 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.300193 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.305417 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.344453 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.398739 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.493341 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.548906 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.570363 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.661610 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.675133 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.680717 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.870571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 11:36:29 crc kubenswrapper[4745]: I1209 11:36:29.936409 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.015369 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.031925 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.032165 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.032098214 podStartE2EDuration="42.032098214s" podCreationTimestamp="2025-12-09 11:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:08.702098107 +0000 UTC m=+255.527299631" watchObservedRunningTime="2025-12-09 11:36:30.032098214 +0000 UTC m=+276.857299738" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.037732 4745 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.037800 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rpddj","openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.038047 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48d4949-4889-4b14-95bd-37954bd71417" containerName="installer" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.038069 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48d4949-4889-4b14-95bd-37954bd71417" containerName="installer" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.038218 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48d4949-4889-4b14-95bd-37954bd71417" containerName="installer" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.038623 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls","openshift-marketplace/certified-operators-zrmwh","openshift-authentication/oauth-openshift-558db77b4-j258s","openshift-marketplace/certified-operators-2mvnf","openshift-marketplace/community-operators-s5fp7","openshift-marketplace/redhat-operators-5fc6n","openshift-marketplace/marketplace-operator-79b997595-w5n22","openshift-marketplace/community-operators-zt7zb","openshift-marketplace/redhat-marketplace-xnwjq","openshift-marketplace/redhat-operators-cn2sj"] Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.038927 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cn2sj" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="registry-server" containerID="cri-o://6b5688e7242e25472304483f57e0c7a070e57ca07927bbc59f38746e7bbfbfce" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.038952 4745 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039273 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgrls" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="registry-server" containerID="cri-o://335f829451ea9ad72ccf7dfb933d6e123fa34c49a709a1f97c338da78d7499e3" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039361 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zt7zb" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="registry-server" containerID="cri-o://e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039532 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zrmwh" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="registry-server" containerID="cri-o://4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039612 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" containerID="cri-o://2cac00cc44cdab96ada68f056ca4b064ff081b3dedac581cfcaf5750bd4e0b4a" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039716 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2mvnf" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="registry-server" containerID="cri-o://4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806" gracePeriod=30 Dec 
09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.039851 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fc6n" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="registry-server" containerID="cri-o://dd7be5e860d00f7d877001d8a14ccc62b5c4eb22d9ddee584d6b6ce770bf661b" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.041412 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s5fp7" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="registry-server" containerID="cri-o://d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.041649 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xnwjq" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="registry-server" containerID="cri-o://dc9484e866b5e4ab3c62825d5ffdd13102693c6e9c335a7f03303aa2ea181d16" gracePeriod=30 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.047421 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.067853 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.067815414 podStartE2EDuration="21.067815414s" podCreationTimestamp="2025-12-09 11:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:30.064319719 +0000 UTC m=+276.889521253" watchObservedRunningTime="2025-12-09 11:36:30.067815414 +0000 UTC m=+276.893016938" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.083449 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.142659 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.174285 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc is running failed: container process not found" containerID="d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.175051 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc is running failed: container process not found" containerID="d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.175788 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc is running failed: container process not found" containerID="d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.175832 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-s5fp7" 
podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="registry-server" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.195106 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwlf\" (UniqueName: \"kubernetes.io/projected/8eb53c13-6b59-4685-a3c0-e925a24f1f24-kube-api-access-rbwlf\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.195193 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.195242 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.253367 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.296002 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwlf\" (UniqueName: \"kubernetes.io/projected/8eb53c13-6b59-4685-a3c0-e925a24f1f24-kube-api-access-rbwlf\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.296056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.296085 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.297286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.300906 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.309390 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8eb53c13-6b59-4685-a3c0-e925a24f1f24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 
11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.319329 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwlf\" (UniqueName: \"kubernetes.io/projected/8eb53c13-6b59-4685-a3c0-e925a24f1f24-kube-api-access-rbwlf\") pod \"marketplace-operator-79b997595-rpddj\" (UID: \"8eb53c13-6b59-4685-a3c0-e925a24f1f24\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.373221 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.404485 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.439003 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115 is running failed: container process not found" containerID="4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.439555 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115 is running failed: container process not found" containerID="4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.439955 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115 is 
running failed: container process not found" containerID="4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.440005 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-zrmwh" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="registry-server" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.557627 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.626989 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.631258 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.667531 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045 is running failed: container process not found" containerID="e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.668128 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045 is running failed: container process not found" 
containerID="e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.668500 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045 is running failed: container process not found" containerID="e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.668605 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-zt7zb" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="registry-server" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.689878 4745 generic.go:334] "Generic (PLEG): container finished" podID="b1305324-76f0-474c-8933-599d9b6eaff4" containerID="4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.689953 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerDied","Data":"4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.692894 4745 generic.go:334] "Generic (PLEG): container finished" podID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerID="6b5688e7242e25472304483f57e0c7a070e57ca07927bbc59f38746e7bbfbfce" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.692970 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerDied","Data":"6b5688e7242e25472304483f57e0c7a070e57ca07927bbc59f38746e7bbfbfce"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.696274 4745 generic.go:334] "Generic (PLEG): container finished" podID="a216af0b-0937-4319-97bc-9d180389b873" containerID="dc9484e866b5e4ab3c62825d5ffdd13102693c6e9c335a7f03303aa2ea181d16" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.696337 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerDied","Data":"dc9484e866b5e4ab3c62825d5ffdd13102693c6e9c335a7f03303aa2ea181d16"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.696356 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnwjq" event={"ID":"a216af0b-0937-4319-97bc-9d180389b873","Type":"ContainerDied","Data":"17440cb627e36b234de495ceced756cfd393a4cab136392c0659c509379aa44c"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.696367 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17440cb627e36b234de495ceced756cfd393a4cab136392c0659c509379aa44c" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.700559 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerID="335f829451ea9ad72ccf7dfb933d6e123fa34c49a709a1f97c338da78d7499e3" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.700807 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerDied","Data":"335f829451ea9ad72ccf7dfb933d6e123fa34c49a709a1f97c338da78d7499e3"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.704390 4745 generic.go:334] "Generic (PLEG): 
container finished" podID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerID="dd7be5e860d00f7d877001d8a14ccc62b5c4eb22d9ddee584d6b6ce770bf661b" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.704478 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerDied","Data":"dd7be5e860d00f7d877001d8a14ccc62b5c4eb22d9ddee584d6b6ce770bf661b"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.707060 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnwjq" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.719492 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerDied","Data":"e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.720921 4745 generic.go:334] "Generic (PLEG): container finished" podID="7e36127f-1987-4067-bf22-38a1bc134721" containerID="e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.725625 4745 generic.go:334] "Generic (PLEG): container finished" podID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerID="2cac00cc44cdab96ada68f056ca4b064ff081b3dedac581cfcaf5750bd4e0b4a" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.725676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" event={"ID":"e5372cf2-1aac-4a33-96ae-f6e7e612195a","Type":"ContainerDied","Data":"2cac00cc44cdab96ada68f056ca4b064ff081b3dedac581cfcaf5750bd4e0b4a"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.727668 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerID="4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.727713 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerDied","Data":"4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.730160 4745 generic.go:334] "Generic (PLEG): container finished" podID="81940df1-47e0-46fa-9215-5751defbf8c0" containerID="d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc" exitCode=0 Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.730933 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerDied","Data":"d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc"} Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.766779 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.775891 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.784740 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806 is running failed: container process not found" containerID="4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.785195 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806 is running failed: container process not found" containerID="4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.785713 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806 is running failed: container process not found" containerID="4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 11:36:30 crc kubenswrapper[4745]: E1209 11:36:30.785782 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-2mvnf" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="registry-server" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.811441 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities\") pod \"a216af0b-0937-4319-97bc-9d180389b873\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.811581 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmhg\" (UniqueName: \"kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg\") pod \"a216af0b-0937-4319-97bc-9d180389b873\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.811685 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content\") pod \"a216af0b-0937-4319-97bc-9d180389b873\" (UID: \"a216af0b-0937-4319-97bc-9d180389b873\") " Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.813039 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities" (OuterVolumeSpecName: "utilities") pod "a216af0b-0937-4319-97bc-9d180389b873" (UID: "a216af0b-0937-4319-97bc-9d180389b873"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.820906 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg" (OuterVolumeSpecName: "kube-api-access-wwmhg") pod "a216af0b-0937-4319-97bc-9d180389b873" (UID: "a216af0b-0937-4319-97bc-9d180389b873"). InnerVolumeSpecName "kube-api-access-wwmhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.826810 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrmwh" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.828539 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgrls" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.830249 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.842552 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a216af0b-0937-4319-97bc-9d180389b873" (UID: "a216af0b-0937-4319-97bc-9d180389b873"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.861953 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.906956 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.913070 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.913111 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a216af0b-0937-4319-97bc-9d180389b873-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.913124 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwmhg\" (UniqueName: \"kubernetes.io/projected/a216af0b-0937-4319-97bc-9d180389b873-kube-api-access-wwmhg\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.917312 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5fp7" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.924650 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.934690 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn2sj" Dec 09 11:36:30 crc kubenswrapper[4745]: I1209 11:36:30.950863 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mvnf" Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.006281 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rpddj"] Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018027 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities\") pod \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018090 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities\") pod \"81940df1-47e0-46fa-9215-5751defbf8c0\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018129 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content\") pod \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 
11:36:31.018169 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content\") pod \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018246 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities\") pod \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018270 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content\") pod \"81940df1-47e0-46fa-9215-5751defbf8c0\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018307 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6\") pod \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\" (UID: \"e2ab340c-642b-4f8d-bc1f-adebf5e79418\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018350 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dtm\" (UniqueName: \"kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm\") pod \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\" (UID: \"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05\") " Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.018391 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlr2m\" (UniqueName: 
\"kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m\") pod \"81940df1-47e0-46fa-9215-5751defbf8c0\" (UID: \"81940df1-47e0-46fa-9215-5751defbf8c0\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.020494 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities" (OuterVolumeSpecName: "utilities") pod "81940df1-47e0-46fa-9215-5751defbf8c0" (UID: "81940df1-47e0-46fa-9215-5751defbf8c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.020721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities" (OuterVolumeSpecName: "utilities") pod "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" (UID: "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.023266 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities" (OuterVolumeSpecName: "utilities") pod "e2ab340c-642b-4f8d-bc1f-adebf5e79418" (UID: "e2ab340c-642b-4f8d-bc1f-adebf5e79418"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.023523 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6" (OuterVolumeSpecName: "kube-api-access-dd6g6") pod "e2ab340c-642b-4f8d-bc1f-adebf5e79418" (UID: "e2ab340c-642b-4f8d-bc1f-adebf5e79418"). InnerVolumeSpecName "kube-api-access-dd6g6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.027315 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m" (OuterVolumeSpecName: "kube-api-access-dlr2m") pod "81940df1-47e0-46fa-9215-5751defbf8c0" (UID: "81940df1-47e0-46fa-9215-5751defbf8c0"). InnerVolumeSpecName "kube-api-access-dlr2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.035682 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.050811 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm" (OuterVolumeSpecName: "kube-api-access-d4dtm") pod "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" (UID: "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05"). InnerVolumeSpecName "kube-api-access-d4dtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.063194 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.067464 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" (UID: "d2d47c68-bfd3-4a99-afbf-7fe4f1478f05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.073999 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fc6n"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.086014 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2ab340c-642b-4f8d-bc1f-adebf5e79418" (UID: "e2ab340c-642b-4f8d-bc1f-adebf5e79418"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.090634 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.090998 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt7zb"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.119415 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81940df1-47e0-46fa-9215-5751defbf8c0" (UID: "81940df1-47e0-46fa-9215-5751defbf8c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123146 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b2x2\" (UniqueName: \"kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2\") pod \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123207 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities\") pod \"daac685e-ee53-47ff-b3c4-7c999567e5cb\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123266 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca\") pod \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123291 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content\") pod \"daac685e-ee53-47ff-b3c4-7c999567e5cb\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123365 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities\") pod \"b1305324-76f0-474c-8933-599d9b6eaff4\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123396 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq8hn\" (UniqueName: \"kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn\") pod \"daac685e-ee53-47ff-b3c4-7c999567e5cb\" (UID: \"daac685e-ee53-47ff-b3c4-7c999567e5cb\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123450 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics\") pod \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\" (UID: \"e5372cf2-1aac-4a33-96ae-f6e7e612195a\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123477 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content\") pod \"b1305324-76f0-474c-8933-599d9b6eaff4\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123500 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtvxs\" (UniqueName: \"kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs\") pod \"b1305324-76f0-474c-8933-599d9b6eaff4\" (UID: \"b1305324-76f0-474c-8933-599d9b6eaff4\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123862 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123884 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123897 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/e2ab340c-642b-4f8d-bc1f-adebf5e79418-kube-api-access-dd6g6\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123909 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dtm\" (UniqueName: \"kubernetes.io/projected/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-kube-api-access-d4dtm\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123919 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlr2m\" (UniqueName: \"kubernetes.io/projected/81940df1-47e0-46fa-9215-5751defbf8c0-kube-api-access-dlr2m\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123930 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123939 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81940df1-47e0-46fa-9215-5751defbf8c0-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123948 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.123956 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab340c-642b-4f8d-bc1f-adebf5e79418-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.129237 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs" (OuterVolumeSpecName: "kube-api-access-xtvxs") pod "b1305324-76f0-474c-8933-599d9b6eaff4" (UID: "b1305324-76f0-474c-8933-599d9b6eaff4"). InnerVolumeSpecName "kube-api-access-xtvxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.130420 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities" (OuterVolumeSpecName: "utilities") pod "b1305324-76f0-474c-8933-599d9b6eaff4" (UID: "b1305324-76f0-474c-8933-599d9b6eaff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.134090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn" (OuterVolumeSpecName: "kube-api-access-xq8hn") pod "daac685e-ee53-47ff-b3c4-7c999567e5cb" (UID: "daac685e-ee53-47ff-b3c4-7c999567e5cb"). InnerVolumeSpecName "kube-api-access-xq8hn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.142075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities" (OuterVolumeSpecName: "utilities") pod "daac685e-ee53-47ff-b3c4-7c999567e5cb" (UID: "daac685e-ee53-47ff-b3c4-7c999567e5cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.142137 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e5372cf2-1aac-4a33-96ae-f6e7e612195a" (UID: "e5372cf2-1aac-4a33-96ae-f6e7e612195a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.143304 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e5372cf2-1aac-4a33-96ae-f6e7e612195a" (UID: "e5372cf2-1aac-4a33-96ae-f6e7e612195a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.144582 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.148750 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2" (OuterVolumeSpecName: "kube-api-access-2b2x2") pod "e5372cf2-1aac-4a33-96ae-f6e7e612195a" (UID: "e5372cf2-1aac-4a33-96ae-f6e7e612195a"). InnerVolumeSpecName "kube-api-access-2b2x2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.151222 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.155978 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.156277 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a" gracePeriod=5
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225572 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities\") pod \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhcfj\" (UniqueName: \"kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj\") pod \"7e36127f-1987-4067-bf22-38a1bc134721\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities\") pod \"7e36127f-1987-4067-bf22-38a1bc134721\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225761 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content\") pod \"7e36127f-1987-4067-bf22-38a1bc134721\" (UID: \"7e36127f-1987-4067-bf22-38a1bc134721\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225796 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9svms\" (UniqueName: \"kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms\") pod \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.225891 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content\") pod \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\" (UID: \"0bf72de4-c628-4b28-9bfc-1d6a874a6575\") "
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.226381 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities" (OuterVolumeSpecName: "utilities") pod "0bf72de4-c628-4b28-9bfc-1d6a874a6575" (UID: "0bf72de4-c628-4b28-9bfc-1d6a874a6575"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.226707 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b2x2\" (UniqueName: \"kubernetes.io/projected/e5372cf2-1aac-4a33-96ae-f6e7e612195a-kube-api-access-2b2x2\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.226728 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.226738 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.227322 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.227354 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.227364 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq8hn\" (UniqueName: \"kubernetes.io/projected/daac685e-ee53-47ff-b3c4-7c999567e5cb-kube-api-access-xq8hn\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.227375 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5372cf2-1aac-4a33-96ae-f6e7e612195a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.227402 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtvxs\" (UniqueName: \"kubernetes.io/projected/b1305324-76f0-474c-8933-599d9b6eaff4-kube-api-access-xtvxs\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.228912 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities" (OuterVolumeSpecName: "utilities") pod "7e36127f-1987-4067-bf22-38a1bc134721" (UID: "7e36127f-1987-4067-bf22-38a1bc134721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.233243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms" (OuterVolumeSpecName: "kube-api-access-9svms") pod "0bf72de4-c628-4b28-9bfc-1d6a874a6575" (UID: "0bf72de4-c628-4b28-9bfc-1d6a874a6575"). InnerVolumeSpecName "kube-api-access-9svms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.233785 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj" (OuterVolumeSpecName: "kube-api-access-nhcfj") pod "7e36127f-1987-4067-bf22-38a1bc134721" (UID: "7e36127f-1987-4067-bf22-38a1bc134721"). InnerVolumeSpecName "kube-api-access-nhcfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.236811 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1305324-76f0-474c-8933-599d9b6eaff4" (UID: "b1305324-76f0-474c-8933-599d9b6eaff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.261704 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.282161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "daac685e-ee53-47ff-b3c4-7c999567e5cb" (UID: "daac685e-ee53-47ff-b3c4-7c999567e5cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.317211 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e36127f-1987-4067-bf22-38a1bc134721" (UID: "7e36127f-1987-4067-bf22-38a1bc134721"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328594 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daac685e-ee53-47ff-b3c4-7c999567e5cb-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328635 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhcfj\" (UniqueName: \"kubernetes.io/projected/7e36127f-1987-4067-bf22-38a1bc134721-kube-api-access-nhcfj\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328647 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328655 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e36127f-1987-4067-bf22-38a1bc134721-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328664 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9svms\" (UniqueName: \"kubernetes.io/projected/0bf72de4-c628-4b28-9bfc-1d6a874a6575-kube-api-access-9svms\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.328674 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1305324-76f0-474c-8933-599d9b6eaff4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.369716 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf72de4-c628-4b28-9bfc-1d6a874a6575" (UID: "0bf72de4-c628-4b28-9bfc-1d6a874a6575"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.430327 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf72de4-c628-4b28-9bfc-1d6a874a6575-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.479402 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.537566 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.555358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.588989 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.602767 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.636860 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.654024 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.709344 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.711990 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.736946 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.741735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt7zb" event={"ID":"7e36127f-1987-4067-bf22-38a1bc134721","Type":"ContainerDied","Data":"526d902a9bdc618f0334d274226e0085bd68f5545106e478b23b06fd694bb71e"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.741920 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt7zb"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.741929 4745 scope.go:117] "RemoveContainer" containerID="e4761aa888954734b8ee461d85a726bfec2867b0f73480d43d03b61ffb89b045"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.744027 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22" event={"ID":"e5372cf2-1aac-4a33-96ae-f6e7e612195a","Type":"ContainerDied","Data":"3d2d0e1f33e5938d57965e9d7feae0e85b852eb8300a7556a07a4ee9b90054ee"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.744309 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w5n22"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.751267 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrmwh" event={"ID":"e2ab340c-642b-4f8d-bc1f-adebf5e79418","Type":"ContainerDied","Data":"28742e5cd1422fb841f565d0f56f476bc100a285b0d2584cb0ffa86173928ecb"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.751432 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrmwh"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.755204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgrls" event={"ID":"d2d47c68-bfd3-4a99-afbf-7fe4f1478f05","Type":"ContainerDied","Data":"af6d5eb0b447bc27f63bb597ade46cba090da2e9e521b0673c6df940a4b116de"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.755316 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgrls"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.761840 4745 scope.go:117] "RemoveContainer" containerID="1d49bfa392e9c2e6cccecdc0beb0eb5490a9ab33614a8f375f652f5507ab1010"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.764416 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fc6n" event={"ID":"0bf72de4-c628-4b28-9bfc-1d6a874a6575","Type":"ContainerDied","Data":"0c58a649554864a5320817d2f8db37d32ec6977ba7f9073021eda274163e66a8"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.764530 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fc6n"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.769899 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mvnf"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.770723 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mvnf" event={"ID":"b1305324-76f0-474c-8933-599d9b6eaff4","Type":"ContainerDied","Data":"cde86d328f67b09dccf2b15257300577b452a9dced8449be9e8211db62682165"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.770846 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zt7zb"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.773182 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn2sj" event={"ID":"daac685e-ee53-47ff-b3c4-7c999567e5cb","Type":"ContainerDied","Data":"be2c85a59a5b6e26ccbeb03a02a9d6171db2fb57393b87b5f206c10910c839d1"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.773370 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn2sj"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.774798 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zt7zb"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.775263 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" event={"ID":"8eb53c13-6b59-4685-a3c0-e925a24f1f24","Type":"ContainerStarted","Data":"367b3ba51ab624ab8cb46f973a4dc931775ac049cf7576b3b02f1e83c1fd14f5"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.775304 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" event={"ID":"8eb53c13-6b59-4685-a3c0-e925a24f1f24","Type":"ContainerStarted","Data":"4b3e78602f695921b5372bf53073c1f47bf2f6d067c57fae2a3337fe7b6d9c71"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.776601 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.781156 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.782611 4745 scope.go:117] "RemoveContainer" containerID="850e0ce396bfb25e9139603a98ea9dee8197373b47d6e2a9bcf6a8eb7d72702b"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.783767 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.784150 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnwjq"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.784644 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5fp7"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.784636 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5fp7" event={"ID":"81940df1-47e0-46fa-9215-5751defbf8c0","Type":"ContainerDied","Data":"b42afdf27f5dfb60e05c782b8ac0939dcb7f0e7b854d9b00d8810db002d9d4dc"}
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.799214 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w5n22"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.799538 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.806145 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w5n22"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.812043 4745 scope.go:117] "RemoveContainer" containerID="2cac00cc44cdab96ada68f056ca4b064ff081b3dedac581cfcaf5750bd4e0b4a"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.816338 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rpddj" podStartSLOduration=2.816325438 podStartE2EDuration="2.816325438s" podCreationTimestamp="2025-12-09 11:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:31.81417535 +0000 UTC m=+278.639376874" watchObservedRunningTime="2025-12-09 11:36:31.816325438 +0000 UTC m=+278.641526972"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.837504 4745 scope.go:117] "RemoveContainer" containerID="4e1cdcf8187f36276824b9ffa38843e940865f9cfa2fea7544fee4c009b76115"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.842631 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrmwh"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.845902 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zrmwh"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.856952 4745 scope.go:117] "RemoveContainer" containerID="89ceb3b77118bb1fbf2ea2a25637a57363263360760ea2ea52466c7e7081dbc8"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.857219 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fc6n"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.859616 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fc6n"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.869306 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.879329 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgrls"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.879978 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.887376 4745 scope.go:117] "RemoveContainer" containerID="d88f7982963daf715e820057f69d7639aa7334698d19875c4552890502f15c57"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.887707 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5fp7"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.897662 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s5fp7"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.907069 4745 scope.go:117] "RemoveContainer" containerID="335f829451ea9ad72ccf7dfb933d6e123fa34c49a709a1f97c338da78d7499e3"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.912774 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn2sj"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.916009 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cn2sj"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.920188 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mvnf"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.923724 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2mvnf"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.930608 4745 scope.go:117] "RemoveContainer" containerID="46a809d4c66bc7ea55a1da13b1458e33caf7c789099e7fbb107ce2454c0c9ae7"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.931788 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnwjq"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.935167 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnwjq"]
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.954756 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.955014 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.960438 4745 scope.go:117] "RemoveContainer" containerID="6407bee20a614bc4fdbd3ebccd99d41430393994b48b0a4fa7abc9708b4e67da"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.977187 4745 scope.go:117] "RemoveContainer" containerID="dd7be5e860d00f7d877001d8a14ccc62b5c4eb22d9ddee584d6b6ce770bf661b"
Dec 09 11:36:31 crc kubenswrapper[4745]: I1209 11:36:31.999848 4745 scope.go:117] "RemoveContainer" containerID="490a87bc46222b454c81fd43d17a2db78d79721460c43c3075792562ac79cba6"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.019462 4745 scope.go:117] "RemoveContainer" containerID="3af8ed8234440e4b4b66c4b4f1aa037d6c28c663c220f11fd5a48e0af8137604"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.041803 4745 scope.go:117] "RemoveContainer" containerID="4d2d623ebb49d765da10b0c661a6c339d9c0c5682e0c3ac7fed6cf539da10806"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.060122 4745 scope.go:117] "RemoveContainer" containerID="769394f4d78b8f966c0a17068097b1be84b808ad404d2dd6db7e98fc46911e1c"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.068986 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.076318 4745 scope.go:117] "RemoveContainer" containerID="1f0808138f4182b8b39ed2f8d1e3c38a9bed54a3509412aea7c52ae6e6da84b7"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.086725 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.090103 4745 scope.go:117] "RemoveContainer" containerID="6b5688e7242e25472304483f57e0c7a070e57ca07927bbc59f38746e7bbfbfce"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.107950 4745 scope.go:117] "RemoveContainer" containerID="30649c38f18402049dfb700591e74f1ce2fad89f52934e8243fd4cf1f6a300aa"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.141139 4745 scope.go:117] "RemoveContainer" containerID="9201bd8dbc866471141fa095357a5f7d41a07aafc608c8e0a84572e1a9673e9e"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.158646 4745 scope.go:117] "RemoveContainer" containerID="d9ae0b9ef202b84af9b66d4541629ac924c5576b57dfdcb3457898a8ab6199dc"
Dec 09 11:36:32 crc kubenswrapper[4745]: I1209
11:36:32.172077 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.180721 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.198105 4745 scope.go:117] "RemoveContainer" containerID="95257d43ff67112399a0721624d89636a6ff999474db1d908d5e8ac11b40122c" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.204451 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.214660 4745 scope.go:117] "RemoveContainer" containerID="7575e78a483a721aad8ea590d9ab4503977a81b22bd5a99226bb766c97acbeac" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.280500 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.371605 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.453283 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.480061 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.568100 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.580236 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 11:36:32 crc kubenswrapper[4745]: 
I1209 11:36:32.616726 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.691115 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.740729 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.751328 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.854784 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.865027 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.915792 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 11:36:32 crc kubenswrapper[4745]: I1209 11:36:32.942712 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.096046 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.193471 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.226122 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.374875 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.380394 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.475307 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.563294 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" path="/var/lib/kubelet/pods/0bf72de4-c628-4b28-9bfc-1d6a874a6575/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.564111 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e36127f-1987-4067-bf22-38a1bc134721" path="/var/lib/kubelet/pods/7e36127f-1987-4067-bf22-38a1bc134721/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.564860 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" path="/var/lib/kubelet/pods/81940df1-47e0-46fa-9215-5751defbf8c0/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.566065 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a216af0b-0937-4319-97bc-9d180389b873" path="/var/lib/kubelet/pods/a216af0b-0937-4319-97bc-9d180389b873/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.566669 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" path="/var/lib/kubelet/pods/b1305324-76f0-474c-8933-599d9b6eaff4/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.567684 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" path="/var/lib/kubelet/pods/d2d47c68-bfd3-4a99-afbf-7fe4f1478f05/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.568274 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" path="/var/lib/kubelet/pods/daac685e-ee53-47ff-b3c4-7c999567e5cb/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.568892 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" path="/var/lib/kubelet/pods/e2ab340c-642b-4f8d-bc1f-adebf5e79418/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.569875 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" path="/var/lib/kubelet/pods/e5372cf2-1aac-4a33-96ae-f6e7e612195a/volumes" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.630236 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.723797 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.728424 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.781107 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.879602 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.968580 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 11:36:33 crc kubenswrapper[4745]: I1209 11:36:33.985386 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.321722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.324929 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.427264 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.447482 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.675035 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.844989 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 11:36:34 crc kubenswrapper[4745]: I1209 11:36:34.897278 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.068558 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.376126 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.429677 4745 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.513196 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.565623 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.577552 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 11:36:35 crc kubenswrapper[4745]: I1209 11:36:35.786384 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.022164 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.031469 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.206337 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.351635 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.723182 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.723601 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.797563 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.821293 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.821391 4745 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a" exitCode=137 Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.821484 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.821534 4745 scope.go:117] "RemoveContainer" containerID="40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.846050 4745 scope.go:117] "RemoveContainer" containerID="40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a" Dec 09 11:36:36 crc kubenswrapper[4745]: E1209 11:36:36.846543 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a\": container with ID starting with 40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a not found: ID does not exist" containerID="40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.846591 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a"} err="failed to get container status \"40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a\": rpc error: code = NotFound desc = could not find container \"40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a\": container with ID starting with 40905afca81ef5df06ab4dd5547a4da1fac05889fa1f0d8d234afc59ce3cae1a not found: ID does not exist" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.864641 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904232 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904286 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904321 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904338 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904408 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904401 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904470 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904544 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904753 4745 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904780 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904792 4745 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.904800 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.920777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:36 crc kubenswrapper[4745]: I1209 11:36:36.920831 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.006488 4745 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.306477 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.562971 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.563635 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.578712 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.578752 4745 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="572a0cf6-0bb3-42c3-bf40-90ce4c07cba7" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.584601 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.584647 4745 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="572a0cf6-0bb3-42c3-bf40-90ce4c07cba7" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.895706 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 11:36:37 crc kubenswrapper[4745]: I1209 11:36:37.962122 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 11:36:38 crc kubenswrapper[4745]: I1209 11:36:38.130975 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 11:36:38 crc kubenswrapper[4745]: I1209 11:36:38.467038 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 11:36:39 crc kubenswrapper[4745]: I1209 11:36:39.385838 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 11:36:53 crc kubenswrapper[4745]: I1209 11:36:53.424245 4745 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.101756 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" podUID="3824312f-b03f-42da-890e-53a61841a8b0" containerName="oauth-openshift" containerID="cri-o://1ef0470226d272b89ae61d0ff568380a20f4fd7bc6c329d7aa20afbc7a0b12df" gracePeriod=15 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.492809 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.493109 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" 
containerName="controller-manager" containerID="cri-o://a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52" gracePeriod=30 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.609186 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.609564 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerName="route-controller-manager" containerID="cri-o://26e7757bf9f67a097fbd74cc86f6930cfeb91f98f762fbdfb3beeeb68f7a9a52" gracePeriod=30 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.886884 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.937489 4745 generic.go:334] "Generic (PLEG): container finished" podID="3824312f-b03f-42da-890e-53a61841a8b0" containerID="1ef0470226d272b89ae61d0ff568380a20f4fd7bc6c329d7aa20afbc7a0b12df" exitCode=0 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.937638 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" event={"ID":"3824312f-b03f-42da-890e-53a61841a8b0","Type":"ContainerDied","Data":"1ef0470226d272b89ae61d0ff568380a20f4fd7bc6c329d7aa20afbc7a0b12df"} Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.939172 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" containerID="a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52" exitCode=0 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.939307 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.939364 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" event={"ID":"fa5c85c2-26a5-44d3-a759-080eb6198c6d","Type":"ContainerDied","Data":"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52"} Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.939452 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hj9bn" event={"ID":"fa5c85c2-26a5-44d3-a759-080eb6198c6d","Type":"ContainerDied","Data":"367a14a5405ff68733e157dfcb2a97d90f270c21f7d6a7a32bd7f6d0cdb957a8"} Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.939483 4745 scope.go:117] "RemoveContainer" containerID="a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52" Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.941331 4745 generic.go:334] "Generic (PLEG): container finished" podID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerID="26e7757bf9f67a097fbd74cc86f6930cfeb91f98f762fbdfb3beeeb68f7a9a52" exitCode=0 Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.941359 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" event={"ID":"6cfa061d-49b8-4640-ae8d-674ef0832ef7","Type":"ContainerDied","Data":"26e7757bf9f67a097fbd74cc86f6930cfeb91f98f762fbdfb3beeeb68f7a9a52"} Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.971135 4745 scope.go:117] "RemoveContainer" containerID="a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52" Dec 09 11:36:55 crc kubenswrapper[4745]: E1209 11:36:55.971993 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52\": 
container with ID starting with a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52 not found: ID does not exist" containerID="a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52" Dec 09 11:36:55 crc kubenswrapper[4745]: I1209 11:36:55.972048 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52"} err="failed to get container status \"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52\": rpc error: code = NotFound desc = could not find container \"a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52\": container with ID starting with a873825b7250c4e62eb4f3f66d79ea3bc0fdac8ca4858102c1dafa09a58eda52 not found: ID does not exist" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.048100 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.055059 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles\") pod \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.055168 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqvmq\" (UniqueName: \"kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq\") pod \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.055286 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca\") pod 
\"fa5c85c2-26a5-44d3-a759-080eb6198c6d\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.055314 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config\") pod \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.055346 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert\") pod \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\" (UID: \"fa5c85c2-26a5-44d3-a759-080eb6198c6d\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.058438 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config" (OuterVolumeSpecName: "config") pod "fa5c85c2-26a5-44d3-a759-080eb6198c6d" (UID: "fa5c85c2-26a5-44d3-a759-080eb6198c6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.058967 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fa5c85c2-26a5-44d3-a759-080eb6198c6d" (UID: "fa5c85c2-26a5-44d3-a759-080eb6198c6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.059058 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa5c85c2-26a5-44d3-a759-080eb6198c6d" (UID: "fa5c85c2-26a5-44d3-a759-080eb6198c6d"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.067144 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa5c85c2-26a5-44d3-a759-080eb6198c6d" (UID: "fa5c85c2-26a5-44d3-a759-080eb6198c6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.068355 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq" (OuterVolumeSpecName: "kube-api-access-pqvmq") pod "fa5c85c2-26a5-44d3-a759-080eb6198c6d" (UID: "fa5c85c2-26a5-44d3-a759-080eb6198c6d"). InnerVolumeSpecName "kube-api-access-pqvmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.078997 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.156888 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157013 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157070 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157148 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157181 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.157238 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157279 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157316 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157345 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157381 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157422 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157455 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157492 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157547 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j68kj\" (UniqueName: \"kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj\") pod \"3824312f-b03f-42da-890e-53a61841a8b0\" (UID: \"3824312f-b03f-42da-890e-53a61841a8b0\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157858 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157878 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-config\") on node \"crc\" DevicePath \"\"" Dec 09 
11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157893 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5c85c2-26a5-44d3-a759-080eb6198c6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157906 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5c85c2-26a5-44d3-a759-080eb6198c6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.157918 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqvmq\" (UniqueName: \"kubernetes.io/projected/fa5c85c2-26a5-44d3-a759-080eb6198c6d-kube-api-access-pqvmq\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.158074 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.159021 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.159290 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.159330 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.159581 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.162039 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.162096 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj" (OuterVolumeSpecName: "kube-api-access-j68kj") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "kube-api-access-j68kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.163760 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.163878 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.164120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.164313 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.164539 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.165855 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.166121 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3824312f-b03f-42da-890e-53a61841a8b0" (UID: "3824312f-b03f-42da-890e-53a61841a8b0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.258763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config\") pod \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.258831 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert\") pod \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.258883 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca\") pod \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.258907 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwg9x\" (UniqueName: \"kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x\") pod \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\" (UID: \"6cfa061d-49b8-4640-ae8d-674ef0832ef7\") " Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259345 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259367 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259382 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259395 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259410 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j68kj\" (UniqueName: \"kubernetes.io/projected/3824312f-b03f-42da-890e-53a61841a8b0-kube-api-access-j68kj\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259426 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259442 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259456 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259471 4745 reconciler_common.go:293] "Volume detached 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3824312f-b03f-42da-890e-53a61841a8b0-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259484 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259502 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cfa061d-49b8-4640-ae8d-674ef0832ef7" (UID: "6cfa061d-49b8-4640-ae8d-674ef0832ef7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config" (OuterVolumeSpecName: "config") pod "6cfa061d-49b8-4640-ae8d-674ef0832ef7" (UID: "6cfa061d-49b8-4640-ae8d-674ef0832ef7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259500 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259645 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259852 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.259863 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3824312f-b03f-42da-890e-53a61841a8b0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.263096 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x" (OuterVolumeSpecName: "kube-api-access-lwg9x") pod "6cfa061d-49b8-4640-ae8d-674ef0832ef7" (UID: "6cfa061d-49b8-4640-ae8d-674ef0832ef7"). InnerVolumeSpecName "kube-api-access-lwg9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.264143 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cfa061d-49b8-4640-ae8d-674ef0832ef7" (UID: "6cfa061d-49b8-4640-ae8d-674ef0832ef7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.279643 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.283920 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hj9bn"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.360745 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.360820 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwg9x\" (UniqueName: \"kubernetes.io/projected/6cfa061d-49b8-4640-ae8d-674ef0832ef7-kube-api-access-lwg9x\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.360833 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfa061d-49b8-4640-ae8d-674ef0832ef7-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.360846 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfa061d-49b8-4640-ae8d-674ef0832ef7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848441 4745 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"] Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848876 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824312f-b03f-42da-890e-53a61841a8b0" containerName="oauth-openshift" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848900 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824312f-b03f-42da-890e-53a61841a8b0" containerName="oauth-openshift" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848911 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848923 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848947 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848954 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848963 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848968 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848979 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerName="route-controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.848986 4745 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerName="route-controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.848995 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849002 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849012 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849018 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849029 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849035 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849041 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849047 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849056 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.849062 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849070 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849076 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849086 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849092 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849102 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849108 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849117 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849123 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849130 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.849136 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849143 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849149 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849157 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849163 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849169 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" containerName="controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849177 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" containerName="controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849188 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849193 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849200 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.849206 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849215 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849221 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849230 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849236 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849243 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849249 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="extract-content" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849258 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849264 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849270 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.849275 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849286 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849292 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849300 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849313 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849322 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="extract-utilities" Dec 09 11:36:56 crc kubenswrapper[4745]: E1209 11:36:56.849331 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849338 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849455 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824312f-b03f-42da-890e-53a61841a8b0" containerName="oauth-openshift" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 
11:36:56.849467 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849475 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a216af0b-0937-4319-97bc-9d180389b873" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849484 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e36127f-1987-4067-bf22-38a1bc134721" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849493 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" containerName="controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849502 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf72de4-c628-4b28-9bfc-1d6a874a6575" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849530 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5372cf2-1aac-4a33-96ae-f6e7e612195a" containerName="marketplace-operator" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849539 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" containerName="route-controller-manager" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849549 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="daac685e-ee53-47ff-b3c4-7c999567e5cb" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849557 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ab340c-642b-4f8d-bc1f-adebf5e79418" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849568 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1305324-76f0-474c-8933-599d9b6eaff4" containerName="registry-server" Dec 
09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849575 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="81940df1-47e0-46fa-9215-5751defbf8c0" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.849583 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d47c68-bfd3-4a99-afbf-7fe4f1478f05" containerName="registry-server" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.850127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.853479 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.859134 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.859259 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.859134 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.859584 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.859783 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.861666 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.864878 4745 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.881199 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.886969 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.898594 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.949733 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" event={"ID":"3824312f-b03f-42da-890e-53a61841a8b0","Type":"ContainerDied","Data":"73a3b538c73a95dcfd9c6e09f4433724b3baff82660b0e862a2346b0d2caaa4e"} Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.949820 4745 scope.go:117] "RemoveContainer" containerID="1ef0470226d272b89ae61d0ff568380a20f4fd7bc6c329d7aa20afbc7a0b12df" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.949774 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j258s" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.956637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" event={"ID":"6cfa061d-49b8-4640-ae8d-674ef0832ef7","Type":"ContainerDied","Data":"4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173"} Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.956706 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.970485 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.970599 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971418 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971478 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971614 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ctq7d\" (UniqueName: \"kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971657 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xt4q\" (UniqueName: \"kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971714 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.971767 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: 
\"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.975438 4745 scope.go:117] "RemoveContainer" containerID="26e7757bf9f67a097fbd74cc86f6930cfeb91f98f762fbdfb3beeeb68f7a9a52" Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.989893 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j258s"] Dec 09 11:36:56 crc kubenswrapper[4745]: I1209 11:36:56.994230 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j258s"] Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.016551 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.020328 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wjx47"] Dec 09 11:36:57 crc kubenswrapper[4745]: E1209 11:36:57.032281 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3824312f_b03f_42da_890e_53a61841a8b0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfa061d_49b8_4640_ae8d_674ef0832ef7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3824312f_b03f_42da_890e_53a61841a8b0.slice/crio-73a3b538c73a95dcfd9c6e09f4433724b3baff82660b0e862a2346b0d2caaa4e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfa061d_49b8_4640_ae8d_674ef0832ef7.slice/crio-4bc2151f7fc44cac8584d90132475f6f75a8fafabd1bee4d597c7cdd5e038173\": 
RecentStats: unable to find data in memory cache]" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.073246 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctq7d\" (UniqueName: \"kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.073313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xt4q\" (UniqueName: \"kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.073356 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.073384 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.074804 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.075425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.075591 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.076578 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.076723 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 
11:36:57.076864 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.077005 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.077048 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.077692 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.079620 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 
11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.080931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.081799 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.091724 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xt4q\" (UniqueName: \"kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q\") pod \"route-controller-manager-749db4944c-5fkp5\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") " pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.096707 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctq7d\" (UniqueName: \"kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d\") pod \"controller-manager-c77c95f7f-vv5cr\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") " pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.203291 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.216278 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.443381 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"] Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.464483 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"] Dec 09 11:36:57 crc kubenswrapper[4745]: W1209 11:36:57.473943 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode00edc0c_cf2c_404c_be6a_8fd7147d384d.slice/crio-6652242d80cc90fc52d48337faf983032a3d2394686122a336a2094c97e619b7 WatchSource:0}: Error finding container 6652242d80cc90fc52d48337faf983032a3d2394686122a336a2094c97e619b7: Status 404 returned error can't find the container with id 6652242d80cc90fc52d48337faf983032a3d2394686122a336a2094c97e619b7 Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.562609 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3824312f-b03f-42da-890e-53a61841a8b0" path="/var/lib/kubelet/pods/3824312f-b03f-42da-890e-53a61841a8b0/volumes" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.563178 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfa061d-49b8-4640-ae8d-674ef0832ef7" path="/var/lib/kubelet/pods/6cfa061d-49b8-4640-ae8d-674ef0832ef7/volumes" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.563748 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5c85c2-26a5-44d3-a759-080eb6198c6d" path="/var/lib/kubelet/pods/fa5c85c2-26a5-44d3-a759-080eb6198c6d/volumes" 
Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.975634 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" event={"ID":"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6","Type":"ContainerStarted","Data":"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"} Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.975688 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" event={"ID":"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6","Type":"ContainerStarted","Data":"51570f67a0aa6ecf6c2ee4aa759278081b2368130ff24ffc8c178a9c103726f4"} Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.978277 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.980731 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" event={"ID":"e00edc0c-cf2c-404c-be6a-8fd7147d384d","Type":"ContainerStarted","Data":"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"} Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.980770 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" event={"ID":"e00edc0c-cf2c-404c-be6a-8fd7147d384d","Type":"ContainerStarted","Data":"6652242d80cc90fc52d48337faf983032a3d2394686122a336a2094c97e619b7"} Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.981986 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.985408 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" Dec 09 11:36:57 crc kubenswrapper[4745]: I1209 11:36:57.993713 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" podStartSLOduration=2.993700468 podStartE2EDuration="2.993700468s" podCreationTimestamp="2025-12-09 11:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:57.991888939 +0000 UTC m=+304.817090463" watchObservedRunningTime="2025-12-09 11:36:57.993700468 +0000 UTC m=+304.818901992" Dec 09 11:36:58 crc kubenswrapper[4745]: I1209 11:36:58.011849 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" podStartSLOduration=3.011827951 podStartE2EDuration="3.011827951s" podCreationTimestamp="2025-12-09 11:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:36:58.010304799 +0000 UTC m=+304.835506323" watchObservedRunningTime="2025-12-09 11:36:58.011827951 +0000 UTC m=+304.837029475" Dec 09 11:36:58 crc kubenswrapper[4745]: I1209 11:36:58.058584 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" Dec 09 11:36:58 crc kubenswrapper[4745]: I1209 11:36:58.190553 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"] Dec 09 11:36:58 crc kubenswrapper[4745]: I1209 11:36:58.239690 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"] Dec 09 11:36:59 crc kubenswrapper[4745]: I1209 11:36:59.991378 4745 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" podUID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" containerName="controller-manager" containerID="cri-o://09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111" gracePeriod=30
Dec 09 11:36:59 crc kubenswrapper[4745]: I1209 11:36:59.991501 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" podUID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" containerName="route-controller-manager" containerID="cri-o://4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb" gracePeriod=30
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.384532 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.419960 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"]
Dec 09 11:37:00 crc kubenswrapper[4745]: E1209 11:37:00.420302 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" containerName="route-controller-manager"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.420320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" containerName="route-controller-manager"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.420453 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" containerName="route-controller-manager"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.421145 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.430437 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config\") pod \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.430645 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xt4q\" (UniqueName: \"kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q\") pod \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.430682 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert\") pod \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.430749 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca\") pod \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\" (UID: \"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431007 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431046 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ftn\" (UniqueName: \"kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431079 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431107 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431264 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config" (OuterVolumeSpecName: "config") pod "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" (UID: "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.431612 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" (UID: "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.434593 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"]
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.437031 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q" (OuterVolumeSpecName: "kube-api-access-6xt4q") pod "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" (UID: "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6"). InnerVolumeSpecName "kube-api-access-6xt4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.437043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" (UID: "d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.489576 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.531396 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config\") pod \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532169 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles\") pod \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532216 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert\") pod \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctq7d\" (UniqueName: \"kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d\") pod \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532323 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca\") pod \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\" (UID: \"e00edc0c-cf2c-404c-be6a-8fd7147d384d\") "
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ftn\" (UniqueName: \"kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532495 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532532 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532602 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532618 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xt4q\" (UniqueName: \"kubernetes.io/projected/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-kube-api-access-6xt4q\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532632 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532643 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6-client-ca\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.533736 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.532099 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config" (OuterVolumeSpecName: "config") pod "e00edc0c-cf2c-404c-be6a-8fd7147d384d" (UID: "e00edc0c-cf2c-404c-be6a-8fd7147d384d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.535223 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.535275 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca" (OuterVolumeSpecName: "client-ca") pod "e00edc0c-cf2c-404c-be6a-8fd7147d384d" (UID: "e00edc0c-cf2c-404c-be6a-8fd7147d384d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.535593 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e00edc0c-cf2c-404c-be6a-8fd7147d384d" (UID: "e00edc0c-cf2c-404c-be6a-8fd7147d384d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.538312 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e00edc0c-cf2c-404c-be6a-8fd7147d384d" (UID: "e00edc0c-cf2c-404c-be6a-8fd7147d384d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.538871 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d" (OuterVolumeSpecName: "kube-api-access-ctq7d") pod "e00edc0c-cf2c-404c-be6a-8fd7147d384d" (UID: "e00edc0c-cf2c-404c-be6a-8fd7147d384d"). InnerVolumeSpecName "kube-api-access-ctq7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.538960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.552105 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ftn\" (UniqueName: \"kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn\") pod \"route-controller-manager-84864cfc78-9khs8\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") " pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.634343 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-client-ca\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.634386 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.634397 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e00edc0c-cf2c-404c-be6a-8fd7147d384d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.634408 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00edc0c-cf2c-404c-be6a-8fd7147d384d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.634418 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctq7d\" (UniqueName: \"kubernetes.io/projected/e00edc0c-cf2c-404c-be6a-8fd7147d384d-kube-api-access-ctq7d\") on node \"crc\" DevicePath \"\""
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.782740 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:00 crc kubenswrapper[4745]: I1209 11:37:00.999225 4745 generic.go:334] "Generic (PLEG): container finished" podID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" containerID="09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111" exitCode=0
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:00.999428 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" event={"ID":"e00edc0c-cf2c-404c-be6a-8fd7147d384d","Type":"ContainerDied","Data":"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"}
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:00.999834 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr" event={"ID":"e00edc0c-cf2c-404c-be6a-8fd7147d384d","Type":"ContainerDied","Data":"6652242d80cc90fc52d48337faf983032a3d2394686122a336a2094c97e619b7"}
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:00.999865 4745 scope.go:117] "RemoveContainer" containerID="09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:00.999580 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.005749 4745 generic.go:334] "Generic (PLEG): container finished" podID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" containerID="4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb" exitCode=0
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.005785 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.005804 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" event={"ID":"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6","Type":"ContainerDied","Data":"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"}
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.005839 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5" event={"ID":"d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6","Type":"ContainerDied","Data":"51570f67a0aa6ecf6c2ee4aa759278081b2368130ff24ffc8c178a9c103726f4"}
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.018368 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"]
Dec 09 11:37:01 crc kubenswrapper[4745]: W1209 11:37:01.023165 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697e0af4_fe0b_44e0_993d_367648ade37f.slice/crio-082bdceb6dabbd1606dd5fd15f4c8b83ca9fb5993b813d61f94d94fde0fa3a52 WatchSource:0}: Error finding container 082bdceb6dabbd1606dd5fd15f4c8b83ca9fb5993b813d61f94d94fde0fa3a52: Status 404 returned error can't find the container with id 082bdceb6dabbd1606dd5fd15f4c8b83ca9fb5993b813d61f94d94fde0fa3a52
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.033313 4745 scope.go:117] "RemoveContainer" containerID="09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"
Dec 09 11:37:01 crc kubenswrapper[4745]: E1209 11:37:01.034238 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111\": container with ID starting with 09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111 not found: ID does not exist" containerID="09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.034284 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111"} err="failed to get container status \"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111\": rpc error: code = NotFound desc = could not find container \"09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111\": container with ID starting with 09a99cf63e9b9e2d3c6786baa81a0f276a9b9ba31fc72d47148aa6517300d111 not found: ID does not exist"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.034319 4745 scope.go:117] "RemoveContainer" containerID="4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.050217 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"]
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.056842 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c77c95f7f-vv5cr"]
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.065108 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"]
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.065844 4745 scope.go:117] "RemoveContainer" containerID="4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"
Dec 09 11:37:01 crc kubenswrapper[4745]: E1209 11:37:01.066485 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb\": container with ID starting with 4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb not found: ID does not exist" containerID="4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.066565 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb"} err="failed to get container status \"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb\": rpc error: code = NotFound desc = could not find container \"4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb\": container with ID starting with 4b24f98e6f11f7cf63061233ee92ed1fecaefd355fb9dd358c56d02ccdc759bb not found: ID does not exist"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.069062 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749db4944c-5fkp5"]
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.561900 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6" path="/var/lib/kubelet/pods/d1a8c2a8-b0c7-4cee-aeb9-e93ca5d0a9c6/volumes"
Dec 09 11:37:01 crc kubenswrapper[4745]: I1209 11:37:01.562648 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" path="/var/lib/kubelet/pods/e00edc0c-cf2c-404c-be6a-8fd7147d384d/volumes"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.013948 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" event={"ID":"697e0af4-fe0b-44e0-993d-367648ade37f","Type":"ContainerStarted","Data":"b84c206daee265dbbef3917cc246f9d8a86a9ce44ac7c65b561506ee3b6132ec"}
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.014369 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.014387 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" event={"ID":"697e0af4-fe0b-44e0-993d-367648ade37f","Type":"ContainerStarted","Data":"082bdceb6dabbd1606dd5fd15f4c8b83ca9fb5993b813d61f94d94fde0fa3a52"}
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.019457 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.062072 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" podStartSLOduration=4.062047945 podStartE2EDuration="4.062047945s" podCreationTimestamp="2025-12-09 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:37:02.04310684 +0000 UTC m=+308.868308364" watchObservedRunningTime="2025-12-09 11:37:02.062047945 +0000 UTC m=+308.887249469"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.851602 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"]
Dec 09 11:37:02 crc kubenswrapper[4745]: E1209 11:37:02.851997 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" containerName="controller-manager"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.852015 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" containerName="controller-manager"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.852139 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00edc0c-cf2c-404c-be6a-8fd7147d384d" containerName="controller-manager"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.852756 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.855162 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.855321 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.855782 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.856345 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.860476 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.862247 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqgtp\" (UniqueName: \"kubernetes.io/projected/7ea6c0ed-c8b0-47d1-8511-cc660074057e-kube-api-access-kqgtp\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.862396 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-proxy-ca-bundles\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.862520 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-client-ca\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.862623 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ea6c0ed-c8b0-47d1-8511-cc660074057e-serving-cert\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.862724 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-config\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.863776 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.874245 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"]
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.875602 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.963292 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqgtp\" (UniqueName: \"kubernetes.io/projected/7ea6c0ed-c8b0-47d1-8511-cc660074057e-kube-api-access-kqgtp\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.963360 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-proxy-ca-bundles\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.963388 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-client-ca\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.963407 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ea6c0ed-c8b0-47d1-8511-cc660074057e-serving-cert\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.963452 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-config\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.965089 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-client-ca\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.965095 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-proxy-ca-bundles\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.965230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea6c0ed-c8b0-47d1-8511-cc660074057e-config\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.979274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ea6c0ed-c8b0-47d1-8511-cc660074057e-serving-cert\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:02 crc kubenswrapper[4745]: I1209 11:37:02.988697 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqgtp\" (UniqueName: \"kubernetes.io/projected/7ea6c0ed-c8b0-47d1-8511-cc660074057e-kube-api-access-kqgtp\") pod \"controller-manager-74976c9c8b-bj9b7\" (UID: \"7ea6c0ed-c8b0-47d1-8511-cc660074057e\") " pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:03 crc kubenswrapper[4745]: I1209 11:37:03.169703 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:03 crc kubenswrapper[4745]: I1209 11:37:03.376734 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"]
Dec 09 11:37:04 crc kubenswrapper[4745]: I1209 11:37:04.031962 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7" event={"ID":"7ea6c0ed-c8b0-47d1-8511-cc660074057e","Type":"ContainerStarted","Data":"cb48537867105c33618321ad7e1437dafb656c48ee32f427c45c6dff29c19893"}
Dec 09 11:37:04 crc kubenswrapper[4745]: I1209 11:37:04.032462 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7" event={"ID":"7ea6c0ed-c8b0-47d1-8511-cc660074057e","Type":"ContainerStarted","Data":"50381fab219632469af4ee9ef7c3d458f657d0607273e0dfff4b7687593f84f0"}
Dec 09 11:37:04 crc kubenswrapper[4745]: I1209 11:37:04.032500 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:04 crc kubenswrapper[4745]: I1209 11:37:04.049634 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7"
Dec 09 11:37:04 crc kubenswrapper[4745]: I1209 11:37:04.058404 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74976c9c8b-bj9b7" podStartSLOduration=6.058385546 podStartE2EDuration="6.058385546s" podCreationTimestamp="2025-12-09 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:37:04.055467647 +0000 UTC m=+310.880669171" watchObservedRunningTime="2025-12-09 11:37:04.058385546 +0000 UTC m=+310.883587060"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.859637 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bb4799f5f-m44dw"]
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.860342 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.868148 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.868541 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.868681 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.868824 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.868953 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.869620 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.869913 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.870071 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.870097 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.870310 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.870727 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.870767 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.884909 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.896287 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.896922 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922707 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6rz\" (UniqueName: \"kubernetes.io/projected/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-kube-api-access-kx6rz\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw"
Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922759 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-idp-0-file-data\") pod
\"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922787 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922803 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-session\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922828 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-dir\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922906 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922949 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-error\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.922980 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923001 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923043 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923345 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-login\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923495 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.923741 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-policies\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:06 crc kubenswrapper[4745]: I1209 11:37:06.964460 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bb4799f5f-m44dw"] Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025698 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-login\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025729 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025760 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025813 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-policies\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: 
\"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025853 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6rz\" (UniqueName: \"kubernetes.io/projected/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-kube-api-access-kx6rz\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025878 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025911 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025934 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-session\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025970 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-dir\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.025998 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.026031 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-error\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.026056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.026081 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " 
pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.026835 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-dir\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.027749 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-audit-policies\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.027776 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.028382 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.028397 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.034393 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-session\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.034402 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.034676 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-error\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.041184 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc 
kubenswrapper[4745]: I1209 11:37:07.041375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.046444 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-user-template-login\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.052548 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.054741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.066798 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6rz\" (UniqueName: 
\"kubernetes.io/projected/1f404191-9569-4dd7-a5a6-624d8b7bbdf2-kube-api-access-kx6rz\") pod \"oauth-openshift-bb4799f5f-m44dw\" (UID: \"1f404191-9569-4dd7-a5a6-624d8b7bbdf2\") " pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.162239 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfd2g"] Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.163207 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.175822 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfd2g"] Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.180543 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345243 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-bound-sa-token\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345808 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b806ac2f-c440-446b-8485-db7139b4ae01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345851 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-trusted-ca\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345890 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b806ac2f-c440-446b-8485-db7139b4ae01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345917 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgr9h\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-kube-api-access-tgr9h\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.345971 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.346006 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-registry-tls\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: 
\"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.346039 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-registry-certificates\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.381856 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448171 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b806ac2f-c440-446b-8485-db7139b4ae01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448251 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgr9h\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-kube-api-access-tgr9h\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448347 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-registry-tls\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448384 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-registry-certificates\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448453 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-bound-sa-token\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448490 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b806ac2f-c440-446b-8485-db7139b4ae01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.448557 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-trusted-ca\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.449642 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b806ac2f-c440-446b-8485-db7139b4ae01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.450349 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-trusted-ca\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.450629 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b806ac2f-c440-446b-8485-db7139b4ae01-registry-certificates\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.457922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b806ac2f-c440-446b-8485-db7139b4ae01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.457931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-registry-tls\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.471073 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-bound-sa-token\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.473437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgr9h\" (UniqueName: \"kubernetes.io/projected/b806ac2f-c440-446b-8485-db7139b4ae01-kube-api-access-tgr9h\") pod \"image-registry-66df7c8f76-mfd2g\" (UID: \"b806ac2f-c440-446b-8485-db7139b4ae01\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.500138 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.634626 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bb4799f5f-m44dw"]
Dec 09 11:37:07 crc kubenswrapper[4745]: I1209 11:37:07.933186 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfd2g"]
Dec 09 11:37:07 crc kubenswrapper[4745]: W1209 11:37:07.944972 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb806ac2f_c440_446b_8485_db7139b4ae01.slice/crio-261186b611237c4092ea2dab70fa3427952a1bf775f6d72254f6da9c7c462cfc WatchSource:0}: Error finding container 261186b611237c4092ea2dab70fa3427952a1bf775f6d72254f6da9c7c462cfc: Status 404 returned error can't find the container with id 261186b611237c4092ea2dab70fa3427952a1bf775f6d72254f6da9c7c462cfc
Dec 09 11:37:08 crc kubenswrapper[4745]: I1209 11:37:08.062110 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" event={"ID":"1f404191-9569-4dd7-a5a6-624d8b7bbdf2","Type":"ContainerStarted","Data":"9836083d0af33d15a600b86bfcccd219069160348cf84993f23be64033e649a5"}
Dec 09 11:37:08 crc kubenswrapper[4745]: I1209 11:37:08.062182 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" event={"ID":"1f404191-9569-4dd7-a5a6-624d8b7bbdf2","Type":"ContainerStarted","Data":"43f44a2e3f8c9ab8dcc87048e13db49def64653665cd246694b2092581503093"}
Dec 09 11:37:08 crc kubenswrapper[4745]: I1209 11:37:08.072813 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" event={"ID":"b806ac2f-c440-446b-8485-db7139b4ae01","Type":"ContainerStarted","Data":"261186b611237c4092ea2dab70fa3427952a1bf775f6d72254f6da9c7c462cfc"}
Dec 09 11:37:08 crc kubenswrapper[4745]: I1209 11:37:08.090589 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw" podStartSLOduration=39.090560981 podStartE2EDuration="39.090560981s" podCreationTimestamp="2025-12-09 11:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:37:08.088877365 +0000 UTC m=+314.914078889" watchObservedRunningTime="2025-12-09 11:37:08.090560981 +0000 UTC m=+314.915762525"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.096851 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" event={"ID":"b806ac2f-c440-446b-8485-db7139b4ae01","Type":"ContainerStarted","Data":"cfe75f083ba6be1c60cfdfa59a8cef9561ba008135a87d4df0cb230c364a6ff3"}
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.097731 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.105225 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bb4799f5f-m44dw"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.126281 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" podStartSLOduration=2.126255384 podStartE2EDuration="2.126255384s" podCreationTimestamp="2025-12-09 11:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:37:09.119735307 +0000 UTC m=+315.944936851" watchObservedRunningTime="2025-12-09 11:37:09.126255384 +0000 UTC m=+315.951456908"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.339083 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n655r"]
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.341614 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.345494 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.353100 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n655r"]
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.485366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-catalog-content\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.485439 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-utilities\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.485564 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqmk\" (UniqueName: \"kubernetes.io/projected/bc9c8707-e572-487a-bc95-08a093771e39-kube-api-access-kdqmk\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.534980 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-prw8f"]
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.536940 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.540193 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.548552 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-prw8f"]
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.587071 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqmk\" (UniqueName: \"kubernetes.io/projected/bc9c8707-e572-487a-bc95-08a093771e39-kube-api-access-kdqmk\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.587141 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-catalog-content\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.587175 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-utilities\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.587660 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-utilities\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.588065 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9c8707-e572-487a-bc95-08a093771e39-catalog-content\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.608094 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqmk\" (UniqueName: \"kubernetes.io/projected/bc9c8707-e572-487a-bc95-08a093771e39-kube-api-access-kdqmk\") pod \"certified-operators-n655r\" (UID: \"bc9c8707-e572-487a-bc95-08a093771e39\") " pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.689112 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-catalog-content\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.689635 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfps\" (UniqueName: \"kubernetes.io/projected/f725dd4f-d47f-4727-8722-88e91fe593b9-kube-api-access-4rfps\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.689731 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-utilities\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.726026 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n655r"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.791387 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-catalog-content\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.791448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfps\" (UniqueName: \"kubernetes.io/projected/f725dd4f-d47f-4727-8722-88e91fe593b9-kube-api-access-4rfps\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.791476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-utilities\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.792235 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-utilities\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.792391 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f725dd4f-d47f-4727-8722-88e91fe593b9-catalog-content\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.812887 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfps\" (UniqueName: \"kubernetes.io/projected/f725dd4f-d47f-4727-8722-88e91fe593b9-kube-api-access-4rfps\") pod \"community-operators-prw8f\" (UID: \"f725dd4f-d47f-4727-8722-88e91fe593b9\") " pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:09 crc kubenswrapper[4745]: I1209 11:37:09.857108 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-prw8f"
Dec 09 11:37:10 crc kubenswrapper[4745]: I1209 11:37:10.102563 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g"
Dec 09 11:37:10 crc kubenswrapper[4745]: I1209 11:37:10.188373 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n655r"]
Dec 09 11:37:10 crc kubenswrapper[4745]: W1209 11:37:10.195390 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9c8707_e572_487a_bc95_08a093771e39.slice/crio-7ae87b054ef38c0d2ac3cf7ba049faf0976e457af1e4f615d370eeb55c05d759 WatchSource:0}: Error finding container 7ae87b054ef38c0d2ac3cf7ba049faf0976e457af1e4f615d370eeb55c05d759: Status 404 returned error can't find the container with id 7ae87b054ef38c0d2ac3cf7ba049faf0976e457af1e4f615d370eeb55c05d759
Dec 09 11:37:10 crc kubenswrapper[4745]: I1209 11:37:10.284547 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-prw8f"]
Dec 09 11:37:10 crc kubenswrapper[4745]: W1209 11:37:10.294973 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf725dd4f_d47f_4727_8722_88e91fe593b9.slice/crio-31c8ef31d78e85ae54750db800144f1c4883a0ff283f411e7791b7c97010b58c WatchSource:0}: Error finding container 31c8ef31d78e85ae54750db800144f1c4883a0ff283f411e7791b7c97010b58c: Status 404 returned error can't find the container with id 31c8ef31d78e85ae54750db800144f1c4883a0ff283f411e7791b7c97010b58c
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.112621 4745 generic.go:334] "Generic (PLEG): container finished" podID="f725dd4f-d47f-4727-8722-88e91fe593b9" containerID="ecfc50ead17bf75e808e1be6769a0c67b149b9b2a3e0b24827c473d52ba8f4ab" exitCode=0
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.112743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prw8f" event={"ID":"f725dd4f-d47f-4727-8722-88e91fe593b9","Type":"ContainerDied","Data":"ecfc50ead17bf75e808e1be6769a0c67b149b9b2a3e0b24827c473d52ba8f4ab"}
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.112795 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prw8f" event={"ID":"f725dd4f-d47f-4727-8722-88e91fe593b9","Type":"ContainerStarted","Data":"31c8ef31d78e85ae54750db800144f1c4883a0ff283f411e7791b7c97010b58c"}
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.115275 4745 generic.go:334] "Generic (PLEG): container finished" podID="bc9c8707-e572-487a-bc95-08a093771e39" containerID="637b46ca9da3686875f902708c21c2da5f90b2a1ad5a01d2de389b7d33eb606b" exitCode=0
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.115477 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n655r" event={"ID":"bc9c8707-e572-487a-bc95-08a093771e39","Type":"ContainerDied","Data":"637b46ca9da3686875f902708c21c2da5f90b2a1ad5a01d2de389b7d33eb606b"}
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.115592 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n655r" event={"ID":"bc9c8707-e572-487a-bc95-08a093771e39","Type":"ContainerStarted","Data":"7ae87b054ef38c0d2ac3cf7ba049faf0976e457af1e4f615d370eeb55c05d759"}
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.933288 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4zp77"]
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.934805 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.936883 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 09 11:37:11 crc kubenswrapper[4745]: I1209 11:37:11.945757 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zp77"]
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.028431 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-utilities\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.028488 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5664\" (UniqueName: \"kubernetes.io/projected/b77426b6-744b-4410-adb5-006a49cf8f1d-kube-api-access-r5664\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.028536 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-catalog-content\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.129442 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-utilities\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.129531 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5664\" (UniqueName: \"kubernetes.io/projected/b77426b6-744b-4410-adb5-006a49cf8f1d-kube-api-access-r5664\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.129582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-catalog-content\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.130085 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-catalog-content\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.130303 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b77426b6-744b-4410-adb5-006a49cf8f1d-utilities\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.138442 4745 generic.go:334] "Generic (PLEG): container finished" podID="f725dd4f-d47f-4727-8722-88e91fe593b9" containerID="d6b7e7e2cfba426e9d7932779a3c53781fada7b88a12ff9dc321f77e1d0809bb" exitCode=0
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.138560 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prw8f" event={"ID":"f725dd4f-d47f-4727-8722-88e91fe593b9","Type":"ContainerDied","Data":"d6b7e7e2cfba426e9d7932779a3c53781fada7b88a12ff9dc321f77e1d0809bb"}
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.139040 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.140089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.142497 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.142830 4745 generic.go:334] "Generic (PLEG): container finished" podID="bc9c8707-e572-487a-bc95-08a093771e39" containerID="bd1efcdefdede4693fe82076504feafde157271c277b06e0838c32fe17f93bc6" exitCode=0
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.142861 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n655r" event={"ID":"bc9c8707-e572-487a-bc95-08a093771e39","Type":"ContainerDied","Data":"bd1efcdefdede4693fe82076504feafde157271c277b06e0838c32fe17f93bc6"}
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.156562 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.159879 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5664\" (UniqueName: \"kubernetes.io/projected/b77426b6-744b-4410-adb5-006a49cf8f1d-kube-api-access-r5664\") pod \"redhat-marketplace-4zp77\" (UID: \"b77426b6-744b-4410-adb5-006a49cf8f1d\") " pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.231725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kp8\" (UniqueName: \"kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.232224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.232380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.295262 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zp77"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.333565 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kp8\" (UniqueName: \"kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.334032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.334093 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.334666 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.337051 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.368225 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kp8\" (UniqueName: \"kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8\") pod \"redhat-operators-7rxgj\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") " pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.564503 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 11:37:12 crc kubenswrapper[4745]: I1209 11:37:12.766475 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zp77"]
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.028724 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.151224 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n655r" event={"ID":"bc9c8707-e572-487a-bc95-08a093771e39","Type":"ContainerStarted","Data":"d204d2da6af8fc2a15bd84fd0b92f770f021f99c1bc1eeba518946f953e2ed76"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.156288 4745 generic.go:334] "Generic (PLEG): container finished" podID="b77426b6-744b-4410-adb5-006a49cf8f1d" containerID="e883331d861b6eafb4fae9e2e973edd7eecc28d0050929b2adeca4931f9569f1" exitCode=0
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.156359 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zp77" event={"ID":"b77426b6-744b-4410-adb5-006a49cf8f1d","Type":"ContainerDied","Data":"e883331d861b6eafb4fae9e2e973edd7eecc28d0050929b2adeca4931f9569f1"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.156387 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zp77" event={"ID":"b77426b6-744b-4410-adb5-006a49cf8f1d","Type":"ContainerStarted","Data":"5dbaa3d8a4adad993b6863201b59c76e9d60fd3ea5681af88205a6b1fa231c45"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.158721 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerStarted","Data":"6fd5a80a3e58595a6d88e732ff921886f2ed061ce05c3a96a63d3a4ebfb9ab93"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.158762 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerStarted","Data":"5e909146a75e93721e9acf51669c5c283e82a75d1b69ee30af02f22cc94c3da6"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.160995 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-prw8f" event={"ID":"f725dd4f-d47f-4727-8722-88e91fe593b9","Type":"ContainerStarted","Data":"285137f9c7bf2642d0bfeba858c22f8f1f1a89bb9893271a45393ab92a7f6cd6"}
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.188534 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n655r" podStartSLOduration=2.709200368 podStartE2EDuration="4.188502433s" podCreationTimestamp="2025-12-09 11:37:09 +0000 UTC" firstStartedPulling="2025-12-09 11:37:11.117482617 +0000 UTC m=+317.942684141" lastFinishedPulling="2025-12-09 11:37:12.596784682 +0000 UTC m=+319.421986206" observedRunningTime="2025-12-09 11:37:13.18507067 +0000 UTC m=+320.010272194" watchObservedRunningTime="2025-12-09 11:37:13.188502433 +0000 UTC m=+320.013703957"
Dec 09 11:37:13 crc kubenswrapper[4745]: I1209 11:37:13.253465 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-prw8f" podStartSLOduration=2.815658347 podStartE2EDuration="4.253445246s" podCreationTimestamp="2025-12-09 11:37:09 +0000 UTC" firstStartedPulling="2025-12-09 11:37:11.115325728 +0000 UTC m=+317.940527252" lastFinishedPulling="2025-12-09 11:37:12.553112627 +0000 UTC m=+319.378314151" observedRunningTime="2025-12-09 11:37:13.250204548 +0000 UTC m=+320.075406082" watchObservedRunningTime="2025-12-09 11:37:13.253445246 +0000 UTC m=+320.078646780"
Dec 09 11:37:14 crc kubenswrapper[4745]: I1209 11:37:14.168863 4745 generic.go:334] "Generic (PLEG): container finished" podID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerID="6fd5a80a3e58595a6d88e732ff921886f2ed061ce05c3a96a63d3a4ebfb9ab93" exitCode=0
Dec 09 11:37:14 crc kubenswrapper[4745]: I1209 11:37:14.168957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerDied","Data":"6fd5a80a3e58595a6d88e732ff921886f2ed061ce05c3a96a63d3a4ebfb9ab93"}
Dec 09 11:37:15 crc kubenswrapper[4745]: I1209 11:37:15.196879 4745 generic.go:334] "Generic (PLEG): container finished" podID="b77426b6-744b-4410-adb5-006a49cf8f1d" containerID="75f41ba007fe16bebdb626b0c792800f538a6569440a0e883f8e9c7ce4d5d701" exitCode=0
Dec 09 11:37:15 crc kubenswrapper[4745]: I1209 11:37:15.197635 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zp77" event={"ID":"b77426b6-744b-4410-adb5-006a49cf8f1d","Type":"ContainerDied","Data":"75f41ba007fe16bebdb626b0c792800f538a6569440a0e883f8e9c7ce4d5d701"}
Dec 09 11:37:15 crc kubenswrapper[4745]: I1209 11:37:15.204000 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerStarted","Data":"59dcb27350b95af5c2cb1b56c9aa07adb0e37da3d7f08e6735e90b7fb7c6c7e1"}
Dec 09 11:37:15 crc kubenswrapper[4745]: I1209 11:37:15.486031 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"]
Dec 09 11:37:15 crc kubenswrapper[4745]: I1209 11:37:15.486594 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" podUID="697e0af4-fe0b-44e0-993d-367648ade37f" containerName="route-controller-manager" containerID="cri-o://b84c206daee265dbbef3917cc246f9d8a86a9ce44ac7c65b561506ee3b6132ec" gracePeriod=30
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.221895 4745 generic.go:334] "Generic (PLEG): container finished" podID="697e0af4-fe0b-44e0-993d-367648ade37f" containerID="b84c206daee265dbbef3917cc246f9d8a86a9ce44ac7c65b561506ee3b6132ec" exitCode=0
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.221997 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" event={"ID":"697e0af4-fe0b-44e0-993d-367648ade37f","Type":"ContainerDied","Data":"b84c206daee265dbbef3917cc246f9d8a86a9ce44ac7c65b561506ee3b6132ec"}
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.225377 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zp77" event={"ID":"b77426b6-744b-4410-adb5-006a49cf8f1d","Type":"ContainerStarted","Data":"8d12e5db0680f4afe0fc429cc83d0573bf5409fe19441101c6933c5f652b66ad"}
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.230239 4745 generic.go:334] "Generic (PLEG): container finished" podID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerID="59dcb27350b95af5c2cb1b56c9aa07adb0e37da3d7f08e6735e90b7fb7c6c7e1" exitCode=0
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.230310 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerDied","Data":"59dcb27350b95af5c2cb1b56c9aa07adb0e37da3d7f08e6735e90b7fb7c6c7e1"}
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.241818 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4zp77" podStartSLOduration=2.576846765 podStartE2EDuration="5.241800974s" podCreationTimestamp="2025-12-09 11:37:11 +0000 UTC" firstStartedPulling="2025-12-09 11:37:13.157630916 +0000 UTC m=+319.982832440" lastFinishedPulling="2025-12-09 11:37:15.822585125 +0000 UTC m=+322.647786649" observedRunningTime="2025-12-09 11:37:16.241234269 +0000 UTC m=+323.066435793" watchObservedRunningTime="2025-12-09 11:37:16.241800974 +0000 UTC m=+323.067002498"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.490161 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.552373 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9"]
Dec 09 11:37:16 crc kubenswrapper[4745]: E1209 11:37:16.552720 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697e0af4-fe0b-44e0-993d-367648ade37f" containerName="route-controller-manager"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.552741 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="697e0af4-fe0b-44e0-993d-367648ade37f" containerName="route-controller-manager"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.552867 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="697e0af4-fe0b-44e0-993d-367648ade37f" containerName="route-controller-manager"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.553375 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9"
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.577253 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9"]
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.627076 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert\") pod \"697e0af4-fe0b-44e0-993d-367648ade37f\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") "
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.627287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca\") pod \"697e0af4-fe0b-44e0-993d-367648ade37f\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") "
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.627362 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ftn\" (UniqueName: \"kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn\") pod \"697e0af4-fe0b-44e0-993d-367648ade37f\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") "
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.627399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config\") pod \"697e0af4-fe0b-44e0-993d-367648ade37f\" (UID: \"697e0af4-fe0b-44e0-993d-367648ade37f\") "
Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.628968 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config" (OuterVolumeSpecName: "config") pod "697e0af4-fe0b-44e0-993d-367648ade37f" (UID:
"697e0af4-fe0b-44e0-993d-367648ade37f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.629545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca" (OuterVolumeSpecName: "client-ca") pod "697e0af4-fe0b-44e0-993d-367648ade37f" (UID: "697e0af4-fe0b-44e0-993d-367648ade37f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.650721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "697e0af4-fe0b-44e0-993d-367648ade37f" (UID: "697e0af4-fe0b-44e0-993d-367648ade37f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.656289 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn" (OuterVolumeSpecName: "kube-api-access-b9ftn") pod "697e0af4-fe0b-44e0-993d-367648ade37f" (UID: "697e0af4-fe0b-44e0-993d-367648ade37f"). InnerVolumeSpecName "kube-api-access-b9ftn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.728942 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-client-ca\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcgl\" (UniqueName: \"kubernetes.io/projected/fd84d536-e551-458a-a219-68b9fc097601-kube-api-access-fpcgl\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729107 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-config\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729187 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd84d536-e551-458a-a219-68b9fc097601-serving-cert\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729243 4745 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729262 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ftn\" (UniqueName: \"kubernetes.io/projected/697e0af4-fe0b-44e0-993d-367648ade37f-kube-api-access-b9ftn\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729276 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e0af4-fe0b-44e0-993d-367648ade37f-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.729288 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697e0af4-fe0b-44e0-993d-367648ade37f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.830734 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-config\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.830797 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd84d536-e551-458a-a219-68b9fc097601-serving-cert\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.830850 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-client-ca\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.830920 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcgl\" (UniqueName: \"kubernetes.io/projected/fd84d536-e551-458a-a219-68b9fc097601-kube-api-access-fpcgl\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.831949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-client-ca\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.832121 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd84d536-e551-458a-a219-68b9fc097601-config\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.835117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd84d536-e551-458a-a219-68b9fc097601-serving-cert\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc 
kubenswrapper[4745]: I1209 11:37:16.848814 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcgl\" (UniqueName: \"kubernetes.io/projected/fd84d536-e551-458a-a219-68b9fc097601-kube-api-access-fpcgl\") pod \"route-controller-manager-cb78cf89c-4rwn9\" (UID: \"fd84d536-e551-458a-a219-68b9fc097601\") " pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:16 crc kubenswrapper[4745]: I1209 11:37:16.872880 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.236439 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.236446 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8" event={"ID":"697e0af4-fe0b-44e0-993d-367648ade37f","Type":"ContainerDied","Data":"082bdceb6dabbd1606dd5fd15f4c8b83ca9fb5993b813d61f94d94fde0fa3a52"} Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.237701 4745 scope.go:117] "RemoveContainer" containerID="b84c206daee265dbbef3917cc246f9d8a86a9ce44ac7c65b561506ee3b6132ec" Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.276104 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"] Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.286538 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84864cfc78-9khs8"] Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.484789 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9"] Dec 09 11:37:17 crc kubenswrapper[4745]: I1209 11:37:17.562771 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697e0af4-fe0b-44e0-993d-367648ade37f" path="/var/lib/kubelet/pods/697e0af4-fe0b-44e0-993d-367648ade37f/volumes" Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.244421 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" event={"ID":"fd84d536-e551-458a-a219-68b9fc097601","Type":"ContainerStarted","Data":"5e7dc8b623f383345e613ba1b17911db5d51bdac9ea23a8380a643a34ea041ec"} Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.244466 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" event={"ID":"fd84d536-e551-458a-a219-68b9fc097601","Type":"ContainerStarted","Data":"b03ed328493d315407a61640b7a0baefea3eb31bb3baa339e7288244444ed68d"} Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.245714 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.249481 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerStarted","Data":"ed6da805d631cc91b5c088e71b1271d00273d41e60075197e15f9e8a2a022df5"} Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.268949 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" podStartSLOduration=3.268922991 podStartE2EDuration="3.268922991s" podCreationTimestamp="2025-12-09 11:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:37:18.265987561 +0000 UTC m=+325.091189095" watchObservedRunningTime="2025-12-09 11:37:18.268922991 +0000 UTC m=+325.094124515" Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.287033 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7rxgj" podStartSLOduration=2.397359788 podStartE2EDuration="6.287010332s" podCreationTimestamp="2025-12-09 11:37:12 +0000 UTC" firstStartedPulling="2025-12-09 11:37:13.159885247 +0000 UTC m=+319.985086771" lastFinishedPulling="2025-12-09 11:37:17.049535791 +0000 UTC m=+323.874737315" observedRunningTime="2025-12-09 11:37:18.285201163 +0000 UTC m=+325.110402687" watchObservedRunningTime="2025-12-09 11:37:18.287010332 +0000 UTC m=+325.112211856" Dec 09 11:37:18 crc kubenswrapper[4745]: I1209 11:37:18.619829 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cb78cf89c-4rwn9" Dec 09 11:37:19 crc kubenswrapper[4745]: I1209 11:37:19.726712 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n655r" Dec 09 11:37:19 crc kubenswrapper[4745]: I1209 11:37:19.727141 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n655r" Dec 09 11:37:19 crc kubenswrapper[4745]: I1209 11:37:19.780745 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n655r" Dec 09 11:37:19 crc kubenswrapper[4745]: I1209 11:37:19.857576 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-prw8f" Dec 09 11:37:19 crc kubenswrapper[4745]: I1209 11:37:19.858151 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-prw8f" Dec 09 11:37:19 crc 
kubenswrapper[4745]: I1209 11:37:19.911331 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-prw8f" Dec 09 11:37:20 crc kubenswrapper[4745]: I1209 11:37:20.306390 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n655r" Dec 09 11:37:20 crc kubenswrapper[4745]: I1209 11:37:20.313632 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-prw8f" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.296218 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4zp77" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.296539 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4zp77" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.345616 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4zp77" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.565701 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7rxgj" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.566032 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7rxgj" Dec 09 11:37:22 crc kubenswrapper[4745]: I1209 11:37:22.606478 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7rxgj" Dec 09 11:37:23 crc kubenswrapper[4745]: I1209 11:37:23.317264 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4zp77" Dec 09 11:37:23 crc kubenswrapper[4745]: I1209 11:37:23.322646 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-7rxgj" Dec 09 11:37:27 crc kubenswrapper[4745]: I1209 11:37:27.505959 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mfd2g" Dec 09 11:37:27 crc kubenswrapper[4745]: I1209 11:37:27.563749 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"] Dec 09 11:37:52 crc kubenswrapper[4745]: I1209 11:37:52.604098 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" podUID="e774e15d-2c99-453e-9c78-4fde0bf037fc" containerName="registry" containerID="cri-o://43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c" gracePeriod=30 Dec 09 11:37:52 crc kubenswrapper[4745]: I1209 11:37:52.985707 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.162257 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.162344 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.162896 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.162935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.163005 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.164042 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.164178 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.164552 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.164670 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.164775 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4cmw\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw\") pod \"e774e15d-2c99-453e-9c78-4fde0bf037fc\" (UID: \"e774e15d-2c99-453e-9c78-4fde0bf037fc\") " Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.165426 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.165537 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.172111 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: 
"e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.172369 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.172821 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw" (OuterVolumeSpecName: "kube-api-access-j4cmw") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "kube-api-access-j4cmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.173441 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.175724 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.181399 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e774e15d-2c99-453e-9c78-4fde0bf037fc" (UID: "e774e15d-2c99-453e-9c78-4fde0bf037fc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.267415 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e774e15d-2c99-453e-9c78-4fde0bf037fc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.267477 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.267490 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4cmw\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-kube-api-access-j4cmw\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.267502 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e774e15d-2c99-453e-9c78-4fde0bf037fc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.267543 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e774e15d-2c99-453e-9c78-4fde0bf037fc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.477809 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="e774e15d-2c99-453e-9c78-4fde0bf037fc" containerID="43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c" exitCode=0 Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.477887 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" event={"ID":"e774e15d-2c99-453e-9c78-4fde0bf037fc","Type":"ContainerDied","Data":"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c"} Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.477937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd" event={"ID":"e774e15d-2c99-453e-9c78-4fde0bf037fc","Type":"ContainerDied","Data":"7c87d26157cab0336c08a51c3e6cf3d8641961b6c17d93b07405c62a5c728810"} Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.477967 4745 scope.go:117] "RemoveContainer" containerID="43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c" Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.478590 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lsfbd"
Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.498595 4745 scope.go:117] "RemoveContainer" containerID="43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c"
Dec 09 11:37:53 crc kubenswrapper[4745]: E1209 11:37:53.499299 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c\": container with ID starting with 43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c not found: ID does not exist" containerID="43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c"
Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.499362 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c"} err="failed to get container status \"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c\": rpc error: code = NotFound desc = could not find container \"43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c\": container with ID starting with 43a3253e707964f834b1b644529c75e56c9463ad825625d843c8a25930c49e4c not found: ID does not exist"
Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.515648 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"]
Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.520959 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lsfbd"]
Dec 09 11:37:53 crc kubenswrapper[4745]: I1209 11:37:53.563784 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e774e15d-2c99-453e-9c78-4fde0bf037fc" path="/var/lib/kubelet/pods/e774e15d-2c99-453e-9c78-4fde0bf037fc/volumes"
Dec 09 11:37:55 crc kubenswrapper[4745]: I1209 11:37:55.476150 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:37:55 crc kubenswrapper[4745]: I1209 11:37:55.476260 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:38:25 crc kubenswrapper[4745]: I1209 11:38:25.476024 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:38:25 crc kubenswrapper[4745]: I1209 11:38:25.477217 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.475639 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.476456 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.476581 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.477677 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.477806 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036" gracePeriod=600
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.877093 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036" exitCode=0
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.877148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036"}
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.877727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b"}
Dec 09 11:38:55 crc kubenswrapper[4745]: I1209 11:38:55.877773 4745 scope.go:117] "RemoveContainer" containerID="bad313723f1c245da1b0da10913c97ca1cea620e750b3d1883c64de4cd304186"
Dec 09 11:40:53 crc kubenswrapper[4745]: I1209 11:40:53.803045 4745 scope.go:117] "RemoveContainer" containerID="b1802f63dac16f2ce7c1e7cea0c0119508c516f81fd2506fd1079bbfa9fe5738"
Dec 09 11:40:55 crc kubenswrapper[4745]: I1209 11:40:55.475705 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:40:55 crc kubenswrapper[4745]: I1209 11:40:55.475789 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:41:25 crc kubenswrapper[4745]: I1209 11:41:25.475582 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:41:25 crc kubenswrapper[4745]: I1209 11:41:25.476096 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:41:53 crc kubenswrapper[4745]: I1209 11:41:53.833208 4745 scope.go:117] "RemoveContainer" containerID="dc9484e866b5e4ab3c62825d5ffdd13102693c6e9c335a7f03303aa2ea181d16"
Dec 09 11:41:53 crc kubenswrapper[4745]: I1209 11:41:53.852792 4745 scope.go:117] "RemoveContainer" containerID="e9210d7e735e5ec52351ab20d682fa553685f6786e68031040a8478a6cc1ccbf"
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.474982 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.476150 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.476297 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.477636 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.477703 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b" gracePeriod=600
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.988465 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b" exitCode=0
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.988559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b"}
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.988817 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2"}
Dec 09 11:41:55 crc kubenswrapper[4745]: I1209 11:41:55.988851 4745 scope.go:117] "RemoveContainer" containerID="583f595c8ce3613fba4ec668e8f573ac6a0b119c60644409502306d93aab9036"
Dec 09 11:43:55 crc kubenswrapper[4745]: I1209 11:43:55.475198 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:43:55 crc kubenswrapper[4745]: I1209 11:43:55.475738 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.559639 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vwrlh"]
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.560525 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-controller" containerID="cri-o://235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.560883 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="sbdb" containerID="cri-o://6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.560928 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="nbdb" containerID="cri-o://64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.560961 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="northd" containerID="cri-o://30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.560990 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.561016 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-node" containerID="cri-o://fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.561045 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-acl-logging" containerID="cri-o://a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.636579 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" containerID="cri-o://fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" gracePeriod=30
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.768074 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log"
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.770242 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovn-acl-logging/0.log"
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.770775 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e" exitCode=0
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.770800 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e" exitCode=143
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.770818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"}
Dec 09 11:44:12 crc kubenswrapper[4745]: I1209 11:44:12.770843 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.609481 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 is running failed: container process not found" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.610433 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 is running failed: container process not found" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.610737 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 is running failed: container process not found" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.610791 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.763628 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.765949 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovn-acl-logging/0.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.766459 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovn-controller/0.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.767129 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.778631 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovnkube-controller/3.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.780642 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovn-acl-logging/0.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781135 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vwrlh_ac484d76-f5da-4880-868d-1e2e5289c025/ovn-controller/0.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781465 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" exitCode=0
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781486 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" exitCode=0
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781494 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb" exitCode=0
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781501 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed" exitCode=0
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781522 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e" exitCode=0
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781531 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac484d76-f5da-4880-868d-1e2e5289c025" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884" exitCode=143
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781561 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781614 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781633 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781660 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781669 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781675 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781681 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781686 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781692 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781696 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781701 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781706 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781712 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vwrlh" event={"ID":"ac484d76-f5da-4880-868d-1e2e5289c025","Type":"ContainerDied","Data":"eaeb40d855e600fe6ba0f9e1295fea6a8d194601cfcd2688bd440f71cbd0036c"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781720 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781726 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781761 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781767 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781773 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781778 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781784 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781789 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781796 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781801 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.781815 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.783426 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/2.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.783922 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/1.log"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.783966 4745 generic.go:334] "Generic (PLEG): container finished" podID="1002b34d-f671-4b20-bf4f-492ce3295cc4" containerID="a5f2ca3ad6920f0955798ebdde6246cdc51cf6e11a8a1877ceec813979e4dd16" exitCode=2
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.784001 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerDied","Data":"a5f2ca3ad6920f0955798ebdde6246cdc51cf6e11a8a1877ceec813979e4dd16"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.784025 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38"}
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.784446 4745 scope.go:117] "RemoveContainer" containerID="a5f2ca3ad6920f0955798ebdde6246cdc51cf6e11a8a1877ceec813979e4dd16"
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.789471 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.789806 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790013 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790110 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790202 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790360 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790473 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790844 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790957 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791019 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791094 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dwc4\" (UniqueName: \"kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791186 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791297 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791366 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791791 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791878 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792005 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792064 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792237 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch\") pod \"ac484d76-f5da-4880-868d-1e2e5289c025\" (UID: \"ac484d76-f5da-4880-868d-1e2e5289c025\") "
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.789766 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.789970 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790231 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790321 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790443 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790714 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790752 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "etc-openvswitch".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.790913 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791395 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket" (OuterVolumeSpecName: "log-socket") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791524 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.791756 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792211 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792246 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log" (OuterVolumeSpecName: "node-log") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792270 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash" (OuterVolumeSpecName: "host-slash") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.792297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.793307 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.798006 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4" (OuterVolumeSpecName: "kube-api-access-6dwc4") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "kube-api-access-6dwc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.808715 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.809088 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.822547 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ac484d76-f5da-4880-868d-1e2e5289c025" (UID: "ac484d76-f5da-4880-868d-1e2e5289c025"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.827486 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nhmqd"] Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.831418 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-acl-logging" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.831641 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-acl-logging" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.831739 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.831820 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.831899 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="sbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.831976 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="sbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832054 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" 
containerName="kubecfg-setup" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832143 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kubecfg-setup" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832216 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="nbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832285 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="nbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832359 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832425 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832491 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832591 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832659 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832729 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832788 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" 
containerName="kube-rbac-proxy-node" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832851 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-node" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.832913 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.832967 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.833021 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833077 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.833130 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833182 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.833237 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="northd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833289 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="northd" Dec 09 11:44:13 crc kubenswrapper[4745]: E1209 11:44:13.833342 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e774e15d-2c99-453e-9c78-4fde0bf037fc" 
containerName="registry" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833394 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e774e15d-2c99-453e-9c78-4fde0bf037fc" containerName="registry" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833586 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833656 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833706 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e774e15d-2c99-453e-9c78-4fde0bf037fc" containerName="registry" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833759 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833813 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833866 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833923 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="nbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.833971 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.834026 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovn-acl-logging" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.834081 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="sbdb" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.834131 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="northd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.834186 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="kube-rbac-proxy-node" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.834442 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" containerName="ovnkube-controller" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.836244 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.854769 4745 scope.go:117] "RemoveContainer" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893226 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-node-log\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893291 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893316 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-systemd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-netns\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6flkc\" (UniqueName: \"kubernetes.io/projected/0a950c62-84c5-423d-8f64-c3ea4105d050-kube-api-access-6flkc\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893400 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-config\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893417 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-script-lib\") pod \"ovnkube-node-nhmqd\" (UID: 
\"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-slash\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893466 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-kubelet\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893492 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-netd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893527 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-env-overrides\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-bin\") pod \"ovnkube-node-nhmqd\" (UID: 
\"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893588 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a950c62-84c5-423d-8f64-c3ea4105d050-ovn-node-metrics-cert\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893609 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-etc-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893626 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893668 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-var-lib-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-systemd-units\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893704 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-log-socket\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893719 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-ovn\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893764 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dwc4\" (UniqueName: \"kubernetes.io/projected/ac484d76-f5da-4880-868d-1e2e5289c025-kube-api-access-6dwc4\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893774 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: 
I1209 11:44:13.893783 4745 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-log-socket\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893791 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893800 4745 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-slash\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893808 4745 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-node-log\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893817 4745 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893827 4745 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893837 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac484d76-f5da-4880-868d-1e2e5289c025-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893844 4745 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893852 4745 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893861 4745 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893870 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893879 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893887 4745 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893897 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893905 4745 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893912 4745 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893920 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac484d76-f5da-4880-868d-1e2e5289c025-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.893928 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac484d76-f5da-4880-868d-1e2e5289c025-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.894885 4745 scope.go:117] "RemoveContainer" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.924413 4745 scope.go:117] "RemoveContainer" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.940570 4745 scope.go:117] "RemoveContainer" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.969764 4745 scope.go:117] "RemoveContainer" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.989372 4745 scope.go:117] "RemoveContainer" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994697 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-systemd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994790 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-netns\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994852 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flkc\" (UniqueName: \"kubernetes.io/projected/0a950c62-84c5-423d-8f64-c3ea4105d050-kube-api-access-6flkc\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994883 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-config\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994884 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994905 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-script-lib\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995035 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-slash\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995088 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-kubelet\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995144 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-netd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-bin\") pod 
\"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995215 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-env-overrides\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995258 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a950c62-84c5-423d-8f64-c3ea4105d050-ovn-node-metrics-cert\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-etc-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995337 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-netns\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995349 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" 
(UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995389 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.994897 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-systemd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995440 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-slash\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995472 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-kubelet\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995526 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995558 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-netd\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995531 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-var-lib-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995605 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-host-cni-bin\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995485 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-var-lib-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc 
kubenswrapper[4745]: I1209 11:44:13.995643 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-etc-openvswitch\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995763 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-systemd-units\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995815 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-log-socket\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995846 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-ovn\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995888 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-node-log\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995912 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-systemd-units\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.995941 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-log-socket\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.996011 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-node-log\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.996247 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-env-overrides\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.996306 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-config\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.996371 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a950c62-84c5-423d-8f64-c3ea4105d050-ovnkube-script-lib\") pod 
\"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:13 crc kubenswrapper[4745]: I1209 11:44:13.996183 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a950c62-84c5-423d-8f64-c3ea4105d050-run-ovn\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.000048 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a950c62-84c5-423d-8f64-c3ea4105d050-ovn-node-metrics-cert\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.006301 4745 scope.go:117] "RemoveContainer" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.018242 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flkc\" (UniqueName: \"kubernetes.io/projected/0a950c62-84c5-423d-8f64-c3ea4105d050-kube-api-access-6flkc\") pod \"ovnkube-node-nhmqd\" (UID: \"0a950c62-84c5-423d-8f64-c3ea4105d050\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.022653 4745 scope.go:117] "RemoveContainer" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.039463 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.039937 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.039976 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} err="failed to get container status \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": rpc error: code = NotFound desc = could not find container \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.040017 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.040250 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": container with ID starting with 9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4 not found: ID does not exist" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.040276 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} err="failed to get container status \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": rpc error: code = NotFound desc = could not find container \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": container with ID 
starting with 9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4 not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.040298 4745 scope.go:117] "RemoveContainer" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.041220 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": container with ID starting with 6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f not found: ID does not exist" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.041242 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"} err="failed to get container status \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": rpc error: code = NotFound desc = could not find container \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": container with ID starting with 6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.041258 4745 scope.go:117] "RemoveContainer" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.041558 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": container with ID starting with 64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb not found: ID does not exist" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb" Dec 09 
11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.041580 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"} err="failed to get container status \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": rpc error: code = NotFound desc = could not find container \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": container with ID starting with 64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.041596 4745 scope.go:117] "RemoveContainer" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.041970 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": container with ID starting with 30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed not found: ID does not exist" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.041991 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"} err="failed to get container status \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": rpc error: code = NotFound desc = could not find container \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": container with ID starting with 30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.042008 4745 scope.go:117] "RemoveContainer" 
containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.042302 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": container with ID starting with 14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e not found: ID does not exist" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.042345 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"} err="failed to get container status \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": rpc error: code = NotFound desc = could not find container \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": container with ID starting with 14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.042365 4745 scope.go:117] "RemoveContainer" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.042669 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": container with ID starting with fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e not found: ID does not exist" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.042733 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"} err="failed to get container status \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": rpc error: code = NotFound desc = could not find container \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": container with ID starting with fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.042756 4745 scope.go:117] "RemoveContainer" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.043113 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": container with ID starting with a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e not found: ID does not exist" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043140 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"} err="failed to get container status \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": rpc error: code = NotFound desc = could not find container \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": container with ID starting with a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043157 4745 scope.go:117] "RemoveContainer" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.043540 4745 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": container with ID starting with 235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884 not found: ID does not exist" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043567 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"} err="failed to get container status \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": rpc error: code = NotFound desc = could not find container \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": container with ID starting with 235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884 not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043588 4745 scope.go:117] "RemoveContainer" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e" Dec 09 11:44:14 crc kubenswrapper[4745]: E1209 11:44:14.043822 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": container with ID starting with 77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e not found: ID does not exist" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043846 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} err="failed to get container status \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": rpc error: code = NotFound desc = could not find container 
\"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": container with ID starting with 77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.043863 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044192 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} err="failed to get container status \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": rpc error: code = NotFound desc = could not find container \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044219 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044502 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} err="failed to get container status \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": rpc error: code = NotFound desc = could not find container \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": container with ID starting with 9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4 not found: ID does not exist" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044543 4745 scope.go:117] "RemoveContainer" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f" Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044757 4745 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"} err="failed to get container status \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": rpc error: code = NotFound desc = could not find container \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": container with ID starting with 6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.044780 4745 scope.go:117] "RemoveContainer" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045014 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"} err="failed to get container status \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": rpc error: code = NotFound desc = could not find container \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": container with ID starting with 64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045036 4745 scope.go:117] "RemoveContainer" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045241 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"} err="failed to get container status \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": rpc error: code = NotFound desc = could not find container \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": container with ID starting with 30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045264 4745 scope.go:117] "RemoveContainer" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045459 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"} err="failed to get container status \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": rpc error: code = NotFound desc = could not find container \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": container with ID starting with 14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045482 4745 scope.go:117] "RemoveContainer" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045739 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"} err="failed to get container status \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": rpc error: code = NotFound desc = could not find container \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": container with ID starting with fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045764 4745 scope.go:117] "RemoveContainer" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.045997 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"} err="failed to get container status \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": rpc error: code = NotFound desc = could not find container \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": container with ID starting with a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046019 4745 scope.go:117] "RemoveContainer" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046342 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"} err="failed to get container status \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": rpc error: code = NotFound desc = could not find container \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": container with ID starting with 235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046369 4745 scope.go:117] "RemoveContainer" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046603 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} err="failed to get container status \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": rpc error: code = NotFound desc = could not find container \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": container with ID starting with 77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046627 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046878 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} err="failed to get container status \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": rpc error: code = NotFound desc = could not find container \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.046903 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047149 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} err="failed to get container status \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": rpc error: code = NotFound desc = could not find container \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": container with ID starting with 9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047172 4745 scope.go:117] "RemoveContainer" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047397 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"} err="failed to get container status \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": rpc error: code = NotFound desc = could not find container \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": container with ID starting with 6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047419 4745 scope.go:117] "RemoveContainer" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047785 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"} err="failed to get container status \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": rpc error: code = NotFound desc = could not find container \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": container with ID starting with 64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.047808 4745 scope.go:117] "RemoveContainer" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048066 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"} err="failed to get container status \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": rpc error: code = NotFound desc = could not find container \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": container with ID starting with 30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048090 4745 scope.go:117] "RemoveContainer" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048343 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"} err="failed to get container status \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": rpc error: code = NotFound desc = could not find container \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": container with ID starting with 14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048371 4745 scope.go:117] "RemoveContainer" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048578 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"} err="failed to get container status \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": rpc error: code = NotFound desc = could not find container \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": container with ID starting with fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048638 4745 scope.go:117] "RemoveContainer" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048913 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"} err="failed to get container status \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": rpc error: code = NotFound desc = could not find container \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": container with ID starting with a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.048942 4745 scope.go:117] "RemoveContainer" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049343 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"} err="failed to get container status \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": rpc error: code = NotFound desc = could not find container \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": container with ID starting with 235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049366 4745 scope.go:117] "RemoveContainer" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049577 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} err="failed to get container status \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": rpc error: code = NotFound desc = could not find container \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": container with ID starting with 77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049604 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049819 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} err="failed to get container status \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": rpc error: code = NotFound desc = could not find container \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.049841 4745 scope.go:117] "RemoveContainer" containerID="9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050033 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4"} err="failed to get container status \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": rpc error: code = NotFound desc = could not find container \"9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4\": container with ID starting with 9176c6b83c5a00c413eb7956835ba3c14938401282ddb8081e214935ba10a9d4 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050055 4745 scope.go:117] "RemoveContainer" containerID="6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050298 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f"} err="failed to get container status \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": rpc error: code = NotFound desc = could not find container \"6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f\": container with ID starting with 6dc8f4ad48b3ebe663007508102ec4ce8c4d63e79337d329a756fa0c9e77a68f not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050321 4745 scope.go:117] "RemoveContainer" containerID="64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050551 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb"} err="failed to get container status \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": rpc error: code = NotFound desc = could not find container \"64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb\": container with ID starting with 64e3c33889e904cfcd601731cdf1276ef705ba974397c5bf133bb64ee17ed2cb not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.050573 4745 scope.go:117] "RemoveContainer" containerID="30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051098 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed"} err="failed to get container status \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": rpc error: code = NotFound desc = could not find container \"30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed\": container with ID starting with 30804feaa7dedc4ae49bb1414da41cabe631b34fc56cb645eb086d88ee7e72ed not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051122 4745 scope.go:117] "RemoveContainer" containerID="14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051388 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e"} err="failed to get container status \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": rpc error: code = NotFound desc = could not find container \"14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e\": container with ID starting with 14c9b6746d0a796b2ecdc90a138a691e3443f8d3e2e44f575439f43ba42e3e1e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051414 4745 scope.go:117] "RemoveContainer" containerID="fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051750 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e"} err="failed to get container status \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": rpc error: code = NotFound desc = could not find container \"fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e\": container with ID starting with fa20af197052256c736162933efb0eba6afd478ae20437a173df538e21cee33e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.051774 4745 scope.go:117] "RemoveContainer" containerID="a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052001 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e"} err="failed to get container status \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": rpc error: code = NotFound desc = could not find container \"a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e\": container with ID starting with a56835214c08c0d90982a349c0f44aba4c6adde9bea54ae9ec586cd4a0b2828e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052027 4745 scope.go:117] "RemoveContainer" containerID="235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052242 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884"} err="failed to get container status \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": rpc error: code = NotFound desc = could not find container \"235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884\": container with ID starting with 235f036efec587aa8c2e5130f2813d366c5a748e67289f24955346399f2ff884 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052265 4745 scope.go:117] "RemoveContainer" containerID="77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052431 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e"} err="failed to get container status \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": rpc error: code = NotFound desc = could not find container \"77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e\": container with ID starting with 77b229685850f560ff08440c47bc819a1d7ec57ff99113cc68b0d4556ef35a6e not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052458 4745 scope.go:117] "RemoveContainer" containerID="fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.052702 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39"} err="failed to get container status \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": rpc error: code = NotFound desc = could not find container \"fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39\": container with ID starting with fc35cbc21efcaa6e26cef57949ef9923fdf64fa0a73a2899810ad3f5d809ab39 not found: ID does not exist"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.111640 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vwrlh"]
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.120104 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vwrlh"]
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.192902 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:14 crc kubenswrapper[4745]: I1209 11:44:14.791473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"4562c3b94e9b6c71a237dc9eb573560ba68c92927dc152bc336e8e39a26dfed5"}
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.566408 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac484d76-f5da-4880-868d-1e2e5289c025" path="/var/lib/kubelet/pods/ac484d76-f5da-4880-868d-1e2e5289c025/volumes"
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.799348 4745 generic.go:334] "Generic (PLEG): container finished" podID="0a950c62-84c5-423d-8f64-c3ea4105d050" containerID="7ebe42a4f697bbd211571d363f5c7d8a9241f469df5bac53bb0e27a611753d01" exitCode=0
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.799480 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerDied","Data":"7ebe42a4f697bbd211571d363f5c7d8a9241f469df5bac53bb0e27a611753d01"}
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.804157 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/2.log"
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.805985 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/1.log"
Dec 09 11:44:15 crc kubenswrapper[4745]: I1209 11:44:15.806111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6gmj" event={"ID":"1002b34d-f671-4b20-bf4f-492ce3295cc4","Type":"ContainerStarted","Data":"f2e5e078a1c308139f9bd051d5cfcfcc0c79829042437247f21fe598fa68ba77"}
Dec 09 11:44:16 crc kubenswrapper[4745]: I1209 11:44:16.815066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"0c2e32cda5d7618e5e2409099b92dabe4ad4dbdbecb0f338134e999371a2c2ae"}
Dec 09 11:44:19 crc kubenswrapper[4745]: I1209 11:44:19.833695 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"b591261e272826649464ba8187ea3c698100a6d111f48b054d24559d18ac6f92"}
Dec 09 11:44:19 crc kubenswrapper[4745]: I1209 11:44:19.834144 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"9a7ba1ffeca2e120c72944454443c59292dedf8c6fe56ef4fdeebb2de40dc938"}
Dec 09 11:44:20 crc kubenswrapper[4745]: I1209 11:44:20.842171 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"187a82df1f8ef0794322ab5cd317d209fdf11a05991d87f9d158c9a9379b88bf"}
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.891355 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bz4vs"]
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.892294 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.894223 4745 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dt72b"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.894635 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.894698 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.894843 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.915600 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jthz\" (UniqueName: \"kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.915704 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:22 crc kubenswrapper[4745]: I1209 11:44:22.915747 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.017361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.017705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jthz\" (UniqueName: \"kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.017835 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.017715 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.018580 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.034539 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jthz\" (UniqueName: \"kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz\") pod \"crc-storage-crc-bz4vs\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: I1209 11:44:23.205160 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: E1209 11:44:23.228689 4745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(37641cafc766659095756df211743029ca1de6b9cd3dab3158f432776f1ef75b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 09 11:44:23 crc kubenswrapper[4745]: E1209 11:44:23.228772 4745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(37641cafc766659095756df211743029ca1de6b9cd3dab3158f432776f1ef75b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: E1209 11:44:23.228800 4745 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(37641cafc766659095756df211743029ca1de6b9cd3dab3158f432776f1ef75b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:23 crc kubenswrapper[4745]: E1209 11:44:23.228866 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bz4vs_crc-storage(16f1f969-bc75-46d9-b92a-5c26f91fbaca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bz4vs_crc-storage(16f1f969-bc75-46d9-b92a-5c26f91fbaca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(37641cafc766659095756df211743029ca1de6b9cd3dab3158f432776f1ef75b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bz4vs" podUID="16f1f969-bc75-46d9-b92a-5c26f91fbaca"
Dec 09 11:44:25 crc kubenswrapper[4745]: I1209 11:44:25.475272 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:44:25 crc kubenswrapper[4745]: I1209 11:44:25.475667 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:44:31 crc kubenswrapper[4745]: I1209 11:44:31.907988 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"719a1ea82f3ea50a8cab77ce3e67380a8fdf742740346d31ca003be84868c4e5"}
Dec 09 11:44:32 crc kubenswrapper[4745]: I1209 11:44:32.169346 4745 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 09 11:44:32 crc kubenswrapper[4745]: I1209 11:44:32.919099 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"b7de3f530166b15c433c4fa5f00dc9732124ab87f6ae94e5fdd96cf4b6fc8c9c"}
Dec 09 11:44:33 crc kubenswrapper[4745]: I1209 11:44:33.932194 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"bcaa3c0933d95a9c2fe12954de0b7d8ffe75972fc7945588a476986584d59f88"}
Dec 09 11:44:34 crc kubenswrapper[4745]: I1209 11:44:34.942899 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" event={"ID":"0a950c62-84c5-423d-8f64-c3ea4105d050","Type":"ContainerStarted","Data":"f9085c0f23f3eddedb963a78fecc598b5214476d3bafa3821b562476c6919aed"}
Dec 09 11:44:34 crc kubenswrapper[4745]: I1209 11:44:34.943288 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:34 crc kubenswrapper[4745]: I1209 11:44:34.973669 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.002645 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd" podStartSLOduration=22.00262803 podStartE2EDuration="22.00262803s" podCreationTimestamp="2025-12-09 11:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:44:34.96896885 +0000 UTC m=+761.794170384" watchObservedRunningTime="2025-12-09 11:44:35.00262803 +0000 UTC m=+761.827829554"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.338306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bz4vs"]
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.338495 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.339122 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:35 crc kubenswrapper[4745]: E1209 11:44:35.364983 4745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(c502e95ea99dad4fc0fa5481c0d8f71d534d2596226c2bb4bea10b95a2fd049e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 09 11:44:35 crc kubenswrapper[4745]: E1209 11:44:35.365061 4745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(c502e95ea99dad4fc0fa5481c0d8f71d534d2596226c2bb4bea10b95a2fd049e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:35 crc kubenswrapper[4745]: E1209 11:44:35.365087 4745 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(c502e95ea99dad4fc0fa5481c0d8f71d534d2596226c2bb4bea10b95a2fd049e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:35 crc kubenswrapper[4745]: E1209 11:44:35.365140 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bz4vs_crc-storage(16f1f969-bc75-46d9-b92a-5c26f91fbaca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bz4vs_crc-storage(16f1f969-bc75-46d9-b92a-5c26f91fbaca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bz4vs_crc-storage_16f1f969-bc75-46d9-b92a-5c26f91fbaca_0(c502e95ea99dad4fc0fa5481c0d8f71d534d2596226c2bb4bea10b95a2fd049e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bz4vs" podUID="16f1f969-bc75-46d9-b92a-5c26f91fbaca"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.948990 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.949327 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:35 crc kubenswrapper[4745]: I1209 11:44:35.977778 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:44 crc kubenswrapper[4745]: I1209 11:44:44.218908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhmqd"
Dec 09 11:44:49 crc kubenswrapper[4745]: I1209 11:44:49.554257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs"
Dec 09 11:44:49 crc kubenswrapper[4745]: I1209 11:44:49.555409 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs" Dec 09 11:44:49 crc kubenswrapper[4745]: I1209 11:44:49.749849 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bz4vs"] Dec 09 11:44:49 crc kubenswrapper[4745]: I1209 11:44:49.759037 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:44:50 crc kubenswrapper[4745]: I1209 11:44:50.031116 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bz4vs" event={"ID":"16f1f969-bc75-46d9-b92a-5c26f91fbaca","Type":"ContainerStarted","Data":"a3c01842a09a0f7ecc7e409bb38499276b4581a7fcb20c8343f215eefe1f6403"} Dec 09 11:44:53 crc kubenswrapper[4745]: I1209 11:44:53.897765 4745 scope.go:117] "RemoveContainer" containerID="e88ce3abcb49b7d6904538b5b2ce3a0a151159de54b316e2824c860f29b18a38" Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.058385 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6gmj_1002b34d-f671-4b20-bf4f-492ce3295cc4/kube-multus/2.log" Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.475427 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.475577 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.475649 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.476746 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:44:55 crc kubenswrapper[4745]: I1209 11:44:55.476885 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2" gracePeriod=600 Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.067051 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2" exitCode=0 Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.067132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2"} Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.067419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f"} Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.067441 4745 scope.go:117] "RemoveContainer" 
containerID="4e5b153d182042c7d46d35f6288637157b6053d650b2c82b502c8e527fd9296b" Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.069946 4745 generic.go:334] "Generic (PLEG): container finished" podID="16f1f969-bc75-46d9-b92a-5c26f91fbaca" containerID="11180372a49bbb5426938b6b1a0c43f0f531db334e4834a0848bee720cc6991d" exitCode=0 Dec 09 11:44:56 crc kubenswrapper[4745]: I1209 11:44:56.070089 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bz4vs" event={"ID":"16f1f969-bc75-46d9-b92a-5c26f91fbaca","Type":"ContainerDied","Data":"11180372a49bbb5426938b6b1a0c43f0f531db334e4834a0848bee720cc6991d"} Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.329047 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.491961 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage\") pod \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.492074 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt\") pod \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.492113 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jthz\" (UniqueName: \"kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz\") pod \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\" (UID: \"16f1f969-bc75-46d9-b92a-5c26f91fbaca\") " Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.492708 4745 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "16f1f969-bc75-46d9-b92a-5c26f91fbaca" (UID: "16f1f969-bc75-46d9-b92a-5c26f91fbaca"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.505874 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz" (OuterVolumeSpecName: "kube-api-access-2jthz") pod "16f1f969-bc75-46d9-b92a-5c26f91fbaca" (UID: "16f1f969-bc75-46d9-b92a-5c26f91fbaca"). InnerVolumeSpecName "kube-api-access-2jthz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.508942 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "16f1f969-bc75-46d9-b92a-5c26f91fbaca" (UID: "16f1f969-bc75-46d9-b92a-5c26f91fbaca"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.594197 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jthz\" (UniqueName: \"kubernetes.io/projected/16f1f969-bc75-46d9-b92a-5c26f91fbaca-kube-api-access-2jthz\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.594231 4745 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16f1f969-bc75-46d9-b92a-5c26f91fbaca-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:57 crc kubenswrapper[4745]: I1209 11:44:57.594239 4745 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16f1f969-bc75-46d9-b92a-5c26f91fbaca-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 11:44:58 crc kubenswrapper[4745]: I1209 11:44:58.087558 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bz4vs" event={"ID":"16f1f969-bc75-46d9-b92a-5c26f91fbaca","Type":"ContainerDied","Data":"a3c01842a09a0f7ecc7e409bb38499276b4581a7fcb20c8343f215eefe1f6403"} Dec 09 11:44:58 crc kubenswrapper[4745]: I1209 11:44:58.087600 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c01842a09a0f7ecc7e409bb38499276b4581a7fcb20c8343f215eefe1f6403" Dec 09 11:44:58 crc kubenswrapper[4745]: I1209 11:44:58.087652 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bz4vs" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.150263 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb"] Dec 09 11:45:00 crc kubenswrapper[4745]: E1209 11:45:00.151899 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f1f969-bc75-46d9-b92a-5c26f91fbaca" containerName="storage" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.151981 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f1f969-bc75-46d9-b92a-5c26f91fbaca" containerName="storage" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.152157 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f1f969-bc75-46d9-b92a-5c26f91fbaca" containerName="storage" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.152732 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.156227 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.157079 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.166697 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb"] Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.231666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: 
\"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.231810 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bds4x\" (UniqueName: \"kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.231929 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.332365 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.332456 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bds4x\" (UniqueName: \"kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.332530 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.334445 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.345414 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.348711 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bds4x\" (UniqueName: \"kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x\") pod \"collect-profiles-29421345-fdcrb\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.472176 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:00 crc kubenswrapper[4745]: I1209 11:45:00.658309 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb"] Dec 09 11:45:00 crc kubenswrapper[4745]: W1209 11:45:00.662916 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6b0836_fccb_4178_94f4_99d6c2d93c1a.slice/crio-66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6 WatchSource:0}: Error finding container 66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6: Status 404 returned error can't find the container with id 66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6 Dec 09 11:45:01 crc kubenswrapper[4745]: I1209 11:45:01.104348 4745 generic.go:334] "Generic (PLEG): container finished" podID="4e6b0836-fccb-4178-94f4-99d6c2d93c1a" containerID="d8e43ca97105c5e0d445175330bb609be6eba3fc05d3026582d9082087b979c9" exitCode=0 Dec 09 11:45:01 crc kubenswrapper[4745]: I1209 11:45:01.104463 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" event={"ID":"4e6b0836-fccb-4178-94f4-99d6c2d93c1a","Type":"ContainerDied","Data":"d8e43ca97105c5e0d445175330bb609be6eba3fc05d3026582d9082087b979c9"} Dec 09 11:45:01 crc kubenswrapper[4745]: I1209 11:45:01.104650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" event={"ID":"4e6b0836-fccb-4178-94f4-99d6c2d93c1a","Type":"ContainerStarted","Data":"66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6"} Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.341144 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.458869 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume\") pod \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.459033 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume\") pod \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.459094 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bds4x\" (UniqueName: \"kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x\") pod \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\" (UID: \"4e6b0836-fccb-4178-94f4-99d6c2d93c1a\") " Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.460294 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e6b0836-fccb-4178-94f4-99d6c2d93c1a" (UID: "4e6b0836-fccb-4178-94f4-99d6c2d93c1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.465071 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4e6b0836-fccb-4178-94f4-99d6c2d93c1a" (UID: "4e6b0836-fccb-4178-94f4-99d6c2d93c1a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.471755 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x" (OuterVolumeSpecName: "kube-api-access-bds4x") pod "4e6b0836-fccb-4178-94f4-99d6c2d93c1a" (UID: "4e6b0836-fccb-4178-94f4-99d6c2d93c1a"). InnerVolumeSpecName "kube-api-access-bds4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.560349 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.560381 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bds4x\" (UniqueName: \"kubernetes.io/projected/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-kube-api-access-bds4x\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:02 crc kubenswrapper[4745]: I1209 11:45:02.560391 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6b0836-fccb-4178-94f4-99d6c2d93c1a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:03 crc kubenswrapper[4745]: I1209 11:45:03.116689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" event={"ID":"4e6b0836-fccb-4178-94f4-99d6c2d93c1a","Type":"ContainerDied","Data":"66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6"} Dec 09 11:45:03 crc kubenswrapper[4745]: I1209 11:45:03.116745 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66afbb3502cea3fa82fdffc39fe197049e9d3c71c253c6c6e36e03579c3f0ad6" Dec 09 11:45:03 crc kubenswrapper[4745]: I1209 11:45:03.116713 4745 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.041415 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2"] Dec 09 11:45:05 crc kubenswrapper[4745]: E1209 11:45:05.042784 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6b0836-fccb-4178-94f4-99d6c2d93c1a" containerName="collect-profiles" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.042855 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6b0836-fccb-4178-94f4-99d6c2d93c1a" containerName="collect-profiles" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.043052 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6b0836-fccb-4178-94f4-99d6c2d93c1a" containerName="collect-profiles" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.043925 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.048001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.052641 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2"] Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.195403 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.195483 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.195554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslnf\" (UniqueName: \"kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: 
I1209 11:45:05.296754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.296845 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslnf\" (UniqueName: \"kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.296903 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.297596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.297645 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.316537 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslnf\" (UniqueName: \"kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.359487 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:05 crc kubenswrapper[4745]: I1209 11:45:05.536213 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2"] Dec 09 11:45:05 crc kubenswrapper[4745]: W1209 11:45:05.544480 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ce7d845_3d2d_409d_be8e_8dd3687cbc4e.slice/crio-64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c WatchSource:0}: Error finding container 64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c: Status 404 returned error can't find the container with id 64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c Dec 09 11:45:06 crc kubenswrapper[4745]: I1209 11:45:06.135867 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" 
event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerStarted","Data":"5223b6228191c246591e7c82020d3239bfc2e4e44cb05ec163624e46d4a2bffe"} Dec 09 11:45:06 crc kubenswrapper[4745]: I1209 11:45:06.136181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerStarted","Data":"64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c"} Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.145234 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerID="5223b6228191c246591e7c82020d3239bfc2e4e44cb05ec163624e46d4a2bffe" exitCode=0 Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.145283 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerDied","Data":"5223b6228191c246591e7c82020d3239bfc2e4e44cb05ec163624e46d4a2bffe"} Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.337062 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.338188 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.345290 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.524118 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.524165 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.524397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmt6x\" (UniqueName: \"kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.625937 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.626189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.626274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmt6x\" (UniqueName: \"kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.626470 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.626571 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.647226 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmt6x\" (UniqueName: \"kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x\") pod \"redhat-operators-fldk8\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:07 crc kubenswrapper[4745]: I1209 11:45:07.708894 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:08 crc kubenswrapper[4745]: I1209 11:45:08.114224 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:08 crc kubenswrapper[4745]: W1209 11:45:08.120075 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05cd5c7c_eb47_43f0_a13d_84f9da6ba5e1.slice/crio-ec234fae0827f56b3d3cf5d52063362523ba4ffc98b4b47767fcef3faddde997 WatchSource:0}: Error finding container ec234fae0827f56b3d3cf5d52063362523ba4ffc98b4b47767fcef3faddde997: Status 404 returned error can't find the container with id ec234fae0827f56b3d3cf5d52063362523ba4ffc98b4b47767fcef3faddde997 Dec 09 11:45:08 crc kubenswrapper[4745]: I1209 11:45:08.157255 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerStarted","Data":"ec234fae0827f56b3d3cf5d52063362523ba4ffc98b4b47767fcef3faddde997"} Dec 09 11:45:09 crc kubenswrapper[4745]: I1209 11:45:09.163784 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerID="f508d92587fb73183677a4328bd08115dffadc63b64188785d990810452ba832" exitCode=0 Dec 09 11:45:09 crc kubenswrapper[4745]: I1209 11:45:09.163967 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerDied","Data":"f508d92587fb73183677a4328bd08115dffadc63b64188785d990810452ba832"} Dec 09 11:45:09 crc kubenswrapper[4745]: I1209 11:45:09.165491 4745 generic.go:334] "Generic (PLEG): container finished" podID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerID="2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6" exitCode=0 Dec 09 11:45:09 crc 
kubenswrapper[4745]: I1209 11:45:09.165545 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerDied","Data":"2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6"} Dec 09 11:45:10 crc kubenswrapper[4745]: I1209 11:45:10.173001 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerID="f4de65465606a8842d82210612784abe71d2ebc27ed98007f69dfa1dd9a323fe" exitCode=0 Dec 09 11:45:10 crc kubenswrapper[4745]: I1209 11:45:10.173064 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerDied","Data":"f4de65465606a8842d82210612784abe71d2ebc27ed98007f69dfa1dd9a323fe"} Dec 09 11:45:10 crc kubenswrapper[4745]: I1209 11:45:10.183701 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerStarted","Data":"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d"} Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.146128 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.250865 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" event={"ID":"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e","Type":"ContainerDied","Data":"64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c"} Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.250916 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.250921 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64027cca1b41f34d2aaba3fce85409cb13d84d14d2a24a32fce9aeaa5fc7277c" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.375410 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cslnf\" (UniqueName: \"kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf\") pod \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.375572 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle\") pod \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.375634 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util\") pod \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\" (UID: \"4ce7d845-3d2d-409d-be8e-8dd3687cbc4e\") " Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.389088 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle" (OuterVolumeSpecName: "bundle") pod "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" (UID: "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.390972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util" (OuterVolumeSpecName: "util") pod "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" (UID: "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.414607 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf" (OuterVolumeSpecName: "kube-api-access-cslnf") pod "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" (UID: "4ce7d845-3d2d-409d-be8e-8dd3687cbc4e"). InnerVolumeSpecName "kube-api-access-cslnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.477317 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cslnf\" (UniqueName: \"kubernetes.io/projected/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-kube-api-access-cslnf\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.477360 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:12 crc kubenswrapper[4745]: I1209 11:45:12.477369 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ce7d845-3d2d-409d-be8e-8dd3687cbc4e-util\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.296543 4745 generic.go:334] "Generic (PLEG): container finished" podID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerID="0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d" exitCode=0 Dec 09 11:45:15 crc 
kubenswrapper[4745]: I1209 11:45:15.296638 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerDied","Data":"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d"} Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.413700 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt"] Dec 09 11:45:15 crc kubenswrapper[4745]: E1209 11:45:15.414017 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="extract" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.414040 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="extract" Dec 09 11:45:15 crc kubenswrapper[4745]: E1209 11:45:15.414094 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="pull" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.414104 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="pull" Dec 09 11:45:15 crc kubenswrapper[4745]: E1209 11:45:15.414119 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="util" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.414126 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="util" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.414230 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce7d845-3d2d-409d-be8e-8dd3687cbc4e" containerName="extract" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.414742 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.419056 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.419283 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8cdv8" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.419295 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.428597 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt"] Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.516342 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsf7\" (UniqueName: \"kubernetes.io/projected/b673e416-f871-4af2-a7ff-d0265a163ba1-kube-api-access-vbsf7\") pod \"nmstate-operator-5b5b58f5c8-8pkjt\" (UID: \"b673e416-f871-4af2-a7ff-d0265a163ba1\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.618058 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsf7\" (UniqueName: \"kubernetes.io/projected/b673e416-f871-4af2-a7ff-d0265a163ba1-kube-api-access-vbsf7\") pod \"nmstate-operator-5b5b58f5c8-8pkjt\" (UID: \"b673e416-f871-4af2-a7ff-d0265a163ba1\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.635285 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsf7\" (UniqueName: \"kubernetes.io/projected/b673e416-f871-4af2-a7ff-d0265a163ba1-kube-api-access-vbsf7\") pod \"nmstate-operator-5b5b58f5c8-8pkjt\" (UID: 
\"b673e416-f871-4af2-a7ff-d0265a163ba1\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" Dec 09 11:45:15 crc kubenswrapper[4745]: I1209 11:45:15.786291 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" Dec 09 11:45:16 crc kubenswrapper[4745]: I1209 11:45:16.146326 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt"] Dec 09 11:45:16 crc kubenswrapper[4745]: W1209 11:45:16.151417 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb673e416_f871_4af2_a7ff_d0265a163ba1.slice/crio-4afa6ddbe582b695878540092d089592fe9d78398de809478fb7aa68a3f25c62 WatchSource:0}: Error finding container 4afa6ddbe582b695878540092d089592fe9d78398de809478fb7aa68a3f25c62: Status 404 returned error can't find the container with id 4afa6ddbe582b695878540092d089592fe9d78398de809478fb7aa68a3f25c62 Dec 09 11:45:16 crc kubenswrapper[4745]: I1209 11:45:16.303219 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" event={"ID":"b673e416-f871-4af2-a7ff-d0265a163ba1","Type":"ContainerStarted","Data":"4afa6ddbe582b695878540092d089592fe9d78398de809478fb7aa68a3f25c62"} Dec 09 11:45:18 crc kubenswrapper[4745]: I1209 11:45:18.319071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerStarted","Data":"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc"} Dec 09 11:45:18 crc kubenswrapper[4745]: I1209 11:45:18.341642 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fldk8" podStartSLOduration=2.971640229 podStartE2EDuration="11.341620417s" podCreationTimestamp="2025-12-09 11:45:07 +0000 UTC" 
firstStartedPulling="2025-12-09 11:45:09.167148567 +0000 UTC m=+795.992350091" lastFinishedPulling="2025-12-09 11:45:17.537128755 +0000 UTC m=+804.362330279" observedRunningTime="2025-12-09 11:45:18.339259603 +0000 UTC m=+805.164461137" watchObservedRunningTime="2025-12-09 11:45:18.341620417 +0000 UTC m=+805.166821941" Dec 09 11:45:24 crc kubenswrapper[4745]: I1209 11:45:24.455499 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" event={"ID":"b673e416-f871-4af2-a7ff-d0265a163ba1","Type":"ContainerStarted","Data":"264e689bc83042a0d95681298d1f75c3a95335cbe1f1be1077d85cfdeb6b8934"} Dec 09 11:45:24 crc kubenswrapper[4745]: I1209 11:45:24.476862 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-8pkjt" podStartSLOduration=1.8839917640000001 podStartE2EDuration="9.476840895s" podCreationTimestamp="2025-12-09 11:45:15 +0000 UTC" firstStartedPulling="2025-12-09 11:45:16.154091828 +0000 UTC m=+802.979293352" lastFinishedPulling="2025-12-09 11:45:23.746940959 +0000 UTC m=+810.572142483" observedRunningTime="2025-12-09 11:45:24.473211908 +0000 UTC m=+811.298413442" watchObservedRunningTime="2025-12-09 11:45:24.476840895 +0000 UTC m=+811.302042419" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.687926 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.689033 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.691648 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-czjtr" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.697326 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.700085 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.701607 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.703864 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-t5h2t"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.704699 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.708573 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.711492 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.764155 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.764436 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7tp\" (UniqueName: \"kubernetes.io/projected/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-kube-api-access-6z7tp\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.837022 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.838176 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.842118 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rt8rv" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.849288 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.849284 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.852324 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk"] Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865356 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865409 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdc4\" (UniqueName: \"kubernetes.io/projected/8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc-kube-api-access-pbdc4\") pod \"nmstate-metrics-7f946cbc9-bt5cb\" (UID: \"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-dbus-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " 
pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865472 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-nmstate-lock\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865495 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-ovs-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865579 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdtr\" (UniqueName: \"kubernetes.io/projected/bb9420a5-03fd-46c5-a340-3e09aaf95935-kube-api-access-8cdtr\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.865616 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7tp\" (UniqueName: \"kubernetes.io/projected/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-kube-api-access-6z7tp\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: E1209 11:45:25.865968 4745 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 09 11:45:25 crc kubenswrapper[4745]: E1209 11:45:25.866026 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair podName:b21bd812-2c99-405f-8ddf-cbca3e8a7c7c nodeName:}" failed. No retries permitted until 2025-12-09 11:45:26.366006818 +0000 UTC m=+813.191208352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-vxv69" (UID: "b21bd812-2c99-405f-8ddf-cbca3e8a7c7c") : secret "openshift-nmstate-webhook" not found Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.887529 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7tp\" (UniqueName: \"kubernetes.io/projected/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-kube-api-access-6z7tp\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdc4\" (UniqueName: \"kubernetes.io/projected/8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc-kube-api-access-pbdc4\") pod \"nmstate-metrics-7f946cbc9-bt5cb\" (UID: \"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967213 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-dbus-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967249 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-nmstate-lock\") 
pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967273 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-ovs-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967319 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967417 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqv7v\" (UniqueName: \"kubernetes.io/projected/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-kube-api-access-vqv7v\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967452 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdtr\" (UniqueName: \"kubernetes.io/projected/bb9420a5-03fd-46c5-a340-3e09aaf95935-kube-api-access-8cdtr\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.967478 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.968152 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-dbus-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.968204 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-nmstate-lock\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.968240 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bb9420a5-03fd-46c5-a340-3e09aaf95935-ovs-socket\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.986151 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdtr\" (UniqueName: \"kubernetes.io/projected/bb9420a5-03fd-46c5-a340-3e09aaf95935-kube-api-access-8cdtr\") pod \"nmstate-handler-t5h2t\" (UID: \"bb9420a5-03fd-46c5-a340-3e09aaf95935\") " pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:25 crc kubenswrapper[4745]: I1209 11:45:25.997272 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdc4\" (UniqueName: \"kubernetes.io/projected/8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc-kube-api-access-pbdc4\") 
pod \"nmstate-metrics-7f946cbc9-bt5cb\" (UID: \"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.025986 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.036100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.037483 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-577cc8c899-4jhc9"] Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.039726 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.056021 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577cc8c899-4jhc9"] Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.069075 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.069495 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqv7v\" (UniqueName: \"kubernetes.io/projected/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-kube-api-access-vqv7v\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.069552 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.070586 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: E1209 11:45:26.070650 4745 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 09 11:45:26 crc kubenswrapper[4745]: E1209 11:45:26.070705 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert podName:26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c nodeName:}" failed. No retries permitted until 2025-12-09 11:45:26.570690981 +0000 UTC m=+813.395892505 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-f8fhk" (UID: "26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c") : secret "plugin-serving-cert" not found Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222417 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-service-ca\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222493 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-trusted-ca-bundle\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222569 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222605 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-oauth-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 
11:45:26.222624 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-console-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222670 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-oauth-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.222718 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2crs\" (UniqueName: \"kubernetes.io/projected/a448240b-5884-43b9-9f82-aefed5dcda2c-kube-api-access-b2crs\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.296647 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqv7v\" (UniqueName: \"kubernetes.io/projected/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-kube-api-access-vqv7v\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.325076 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2crs\" (UniqueName: \"kubernetes.io/projected/a448240b-5884-43b9-9f82-aefed5dcda2c-kube-api-access-b2crs\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " 
pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.325526 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-service-ca\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.325671 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-trusted-ca-bundle\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.325807 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.325941 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-oauth-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.326044 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-console-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " 
pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.326140 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-oauth-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.328252 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-trusted-ca-bundle\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.329330 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-oauth-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.330034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-service-ca\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.330260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-oauth-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 
11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.330275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a448240b-5884-43b9-9f82-aefed5dcda2c-console-config\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.335434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a448240b-5884-43b9-9f82-aefed5dcda2c-console-serving-cert\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.345268 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2crs\" (UniqueName: \"kubernetes.io/projected/a448240b-5884-43b9-9f82-aefed5dcda2c-kube-api-access-b2crs\") pod \"console-577cc8c899-4jhc9\" (UID: \"a448240b-5884-43b9-9f82-aefed5dcda2c\") " pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.408079 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.428036 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.431583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b21bd812-2c99-405f-8ddf-cbca3e8a7c7c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vxv69\" (UID: \"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.471001 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t5h2t" event={"ID":"bb9420a5-03fd-46c5-a340-3e09aaf95935","Type":"ContainerStarted","Data":"ebfee926908e04fe4e8fcecdff6985e83883a773e068939d4c8581100055e952"} Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.618894 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.630646 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.636269 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f8fhk\" (UID: \"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.755489 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.786988 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb"] Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.879799 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577cc8c899-4jhc9"] Dec 09 11:45:26 crc kubenswrapper[4745]: I1209 11:45:26.983749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69"] Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.031479 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk"] Dec 09 11:45:27 crc kubenswrapper[4745]: W1209 11:45:27.044142 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26a2ae6b_2e87_4fc8_bc5e_60e2f605f38c.slice/crio-c9a8b08e1d3008e6b3b2cd4e86bf86da4a8911ef0313cd3d0672f1b5415b86de WatchSource:0}: Error finding container c9a8b08e1d3008e6b3b2cd4e86bf86da4a8911ef0313cd3d0672f1b5415b86de: Status 404 returned error can't find the container with id c9a8b08e1d3008e6b3b2cd4e86bf86da4a8911ef0313cd3d0672f1b5415b86de Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.482251 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" event={"ID":"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c","Type":"ContainerStarted","Data":"c9a8b08e1d3008e6b3b2cd4e86bf86da4a8911ef0313cd3d0672f1b5415b86de"} Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.483488 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" event={"ID":"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc","Type":"ContainerStarted","Data":"63515723534eb126de617338ebe6f0b9e9ff69348bb03f890053399571900fa0"} Dec 09 
11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.484645 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" event={"ID":"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c","Type":"ContainerStarted","Data":"eb46963ad288f27e2ae95805c2c5a68b02a0c48fe9b2229c58bfd3f9d59a29cb"} Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.486259 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cc8c899-4jhc9" event={"ID":"a448240b-5884-43b9-9f82-aefed5dcda2c","Type":"ContainerStarted","Data":"4ad677f621b1bb670a7bc4080299f7347679236148d45b4709e9eb781d8d4643"} Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.486285 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cc8c899-4jhc9" event={"ID":"a448240b-5884-43b9-9f82-aefed5dcda2c","Type":"ContainerStarted","Data":"334eee8d6c0543e91ce7783b5c5a422ff494c2ce1c0bea23c7315a7a53c89db0"} Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.509138 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-577cc8c899-4jhc9" podStartSLOduration=1.509110289 podStartE2EDuration="1.509110289s" podCreationTimestamp="2025-12-09 11:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:45:27.503880669 +0000 UTC m=+814.329082193" watchObservedRunningTime="2025-12-09 11:45:27.509110289 +0000 UTC m=+814.334311813" Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.709462 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.709573 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:27 crc kubenswrapper[4745]: I1209 11:45:27.827832 4745 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:28 crc kubenswrapper[4745]: I1209 11:45:28.624933 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:28 crc kubenswrapper[4745]: I1209 11:45:28.676376 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:29 crc kubenswrapper[4745]: I1209 11:45:29.507882 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" event={"ID":"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc","Type":"ContainerStarted","Data":"535726fb98dbd2eb5ae6d38a5758dee52a3c2f2d266d3c15e973ad999674f112"} Dec 09 11:45:29 crc kubenswrapper[4745]: I1209 11:45:29.510140 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" event={"ID":"b21bd812-2c99-405f-8ddf-cbca3e8a7c7c","Type":"ContainerStarted","Data":"5fcce146676db9770a5662df884f1eaed6de74c5771018102c3ea6c155c970b2"} Dec 09 11:45:29 crc kubenswrapper[4745]: I1209 11:45:29.511419 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:29 crc kubenswrapper[4745]: I1209 11:45:29.530119 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" podStartSLOduration=2.296144392 podStartE2EDuration="4.53010008s" podCreationTimestamp="2025-12-09 11:45:25 +0000 UTC" firstStartedPulling="2025-12-09 11:45:27.003902785 +0000 UTC m=+813.829104309" lastFinishedPulling="2025-12-09 11:45:29.237858473 +0000 UTC m=+816.063059997" observedRunningTime="2025-12-09 11:45:29.527141041 +0000 UTC m=+816.352342585" watchObservedRunningTime="2025-12-09 11:45:29.53010008 +0000 UTC m=+816.355301604" Dec 09 11:45:30 crc kubenswrapper[4745]: I1209 11:45:30.527140 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t5h2t" event={"ID":"bb9420a5-03fd-46c5-a340-3e09aaf95935","Type":"ContainerStarted","Data":"5e09d93b50d9bf54e7b85e28e94d4d67993266418a4e15d1aec98f0a7008941a"} Dec 09 11:45:30 crc kubenswrapper[4745]: I1209 11:45:30.527426 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fldk8" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="registry-server" containerID="cri-o://b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc" gracePeriod=2 Dec 09 11:45:30 crc kubenswrapper[4745]: I1209 11:45:30.528773 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:30 crc kubenswrapper[4745]: I1209 11:45:30.555064 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-t5h2t" podStartSLOduration=2.409380436 podStartE2EDuration="5.55504328s" podCreationTimestamp="2025-12-09 11:45:25 +0000 UTC" firstStartedPulling="2025-12-09 11:45:26.091210943 +0000 UTC m=+812.916412467" lastFinishedPulling="2025-12-09 11:45:29.236873787 +0000 UTC m=+816.062075311" observedRunningTime="2025-12-09 11:45:30.554438914 +0000 UTC m=+817.379640448" watchObservedRunningTime="2025-12-09 11:45:30.55504328 +0000 UTC m=+817.380244794" Dec 09 11:45:30 crc kubenswrapper[4745]: I1209 11:45:30.943931 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.137246 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmt6x\" (UniqueName: \"kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x\") pod \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.137430 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities\") pod \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.137459 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content\") pod \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\" (UID: \"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1\") " Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.138877 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities" (OuterVolumeSpecName: "utilities") pod "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" (UID: "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.141243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x" (OuterVolumeSpecName: "kube-api-access-wmt6x") pod "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" (UID: "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1"). InnerVolumeSpecName "kube-api-access-wmt6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.238897 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.238935 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmt6x\" (UniqueName: \"kubernetes.io/projected/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-kube-api-access-wmt6x\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.249271 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" (UID: "05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.340905 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.537144 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" event={"ID":"26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c","Type":"ContainerStarted","Data":"bdb5ab2c9b2c4e8c0a7d663576f41af4c82cbf9739fcff392a2738d69ca54272"} Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.539346 4745 generic.go:334] "Generic (PLEG): container finished" podID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerID="b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc" exitCode=0 Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.539782 4745 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fldk8" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.540694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerDied","Data":"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc"} Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.540760 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fldk8" event={"ID":"05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1","Type":"ContainerDied","Data":"ec234fae0827f56b3d3cf5d52063362523ba4ffc98b4b47767fcef3faddde997"} Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.540783 4745 scope.go:117] "RemoveContainer" containerID="b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.552887 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f8fhk" podStartSLOduration=3.077358699 podStartE2EDuration="6.552869881s" podCreationTimestamp="2025-12-09 11:45:25 +0000 UTC" firstStartedPulling="2025-12-09 11:45:27.046295495 +0000 UTC m=+813.871497019" lastFinishedPulling="2025-12-09 11:45:30.521806677 +0000 UTC m=+817.347008201" observedRunningTime="2025-12-09 11:45:31.551228447 +0000 UTC m=+818.376429991" watchObservedRunningTime="2025-12-09 11:45:31.552869881 +0000 UTC m=+818.378071405" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.577993 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.585453 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fldk8"] Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.742987 4745 scope.go:117] "RemoveContainer" 
containerID="0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.765904 4745 scope.go:117] "RemoveContainer" containerID="2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.785979 4745 scope.go:117] "RemoveContainer" containerID="b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc" Dec 09 11:45:31 crc kubenswrapper[4745]: E1209 11:45:31.786732 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc\": container with ID starting with b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc not found: ID does not exist" containerID="b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.786802 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc"} err="failed to get container status \"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc\": rpc error: code = NotFound desc = could not find container \"b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc\": container with ID starting with b483bd27d37a5bebac97e130d8499078e6d5c987cd9eb823cb31e68c084ee4fc not found: ID does not exist" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.786837 4745 scope.go:117] "RemoveContainer" containerID="0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d" Dec 09 11:45:31 crc kubenswrapper[4745]: E1209 11:45:31.787924 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d\": container with ID starting with 
0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d not found: ID does not exist" containerID="0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.787952 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d"} err="failed to get container status \"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d\": rpc error: code = NotFound desc = could not find container \"0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d\": container with ID starting with 0d7b8f05c903ffebf9ee8a1ac189598d22e0f5415063962574443a38c9b9e27d not found: ID does not exist" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.787985 4745 scope.go:117] "RemoveContainer" containerID="2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6" Dec 09 11:45:31 crc kubenswrapper[4745]: E1209 11:45:31.788749 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6\": container with ID starting with 2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6 not found: ID does not exist" containerID="2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6" Dec 09 11:45:31 crc kubenswrapper[4745]: I1209 11:45:31.788778 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6"} err="failed to get container status \"2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6\": rpc error: code = NotFound desc = could not find container \"2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6\": container with ID starting with 2ed633d574bb380a80b2be0a5ab4437aa8c457b87f8570e4244f7f127e5477a6 not found: ID does not 
exist" Dec 09 11:45:32 crc kubenswrapper[4745]: I1209 11:45:32.547147 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" event={"ID":"8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc","Type":"ContainerStarted","Data":"45823577f7d771841ee5865fe5f25022058f5958573a047a6674b7f34b3b222b"} Dec 09 11:45:32 crc kubenswrapper[4745]: I1209 11:45:32.568321 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bt5cb" podStartSLOduration=2.528192783 podStartE2EDuration="7.568301485s" podCreationTimestamp="2025-12-09 11:45:25 +0000 UTC" firstStartedPulling="2025-12-09 11:45:26.80178261 +0000 UTC m=+813.626984124" lastFinishedPulling="2025-12-09 11:45:31.841891302 +0000 UTC m=+818.667092826" observedRunningTime="2025-12-09 11:45:32.562249022 +0000 UTC m=+819.387450546" watchObservedRunningTime="2025-12-09 11:45:32.568301485 +0000 UTC m=+819.393503009" Dec 09 11:45:33 crc kubenswrapper[4745]: I1209 11:45:33.560492 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" path="/var/lib/kubelet/pods/05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1/volumes" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 11:45:36.057878 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-t5h2t" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 11:45:36.409161 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 11:45:36.409245 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 11:45:36.416246 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 
11:45:36.572870 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-577cc8c899-4jhc9" Dec 09 11:45:36 crc kubenswrapper[4745]: I1209 11:45:36.632438 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:45:46 crc kubenswrapper[4745]: I1209 11:45:46.626737 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vxv69" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.863140 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm"] Dec 09 11:45:59 crc kubenswrapper[4745]: E1209 11:45:59.863783 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="extract-utilities" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.863797 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="extract-utilities" Dec 09 11:45:59 crc kubenswrapper[4745]: E1209 11:45:59.863817 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="extract-content" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.863823 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="extract-content" Dec 09 11:45:59 crc kubenswrapper[4745]: E1209 11:45:59.863832 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="registry-server" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.863838 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="registry-server" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.863933 4745 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="05cd5c7c-eb47-43f0-a13d-84f9da6ba5e1" containerName="registry-server" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.864690 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.866727 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 11:45:59 crc kubenswrapper[4745]: I1209 11:45:59.869665 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm"] Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.036914 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5cs\" (UniqueName: \"kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.036991 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.037060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.137789 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5cs\" (UniqueName: \"kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.137865 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.137891 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.138434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.138437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.161218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5cs\" (UniqueName: \"kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.182120 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.602341 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm"] Dec 09 11:46:00 crc kubenswrapper[4745]: I1209 11:46:00.734225 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" event={"ID":"51904623-2e7d-4a4d-a614-c916a8039fe9","Type":"ContainerStarted","Data":"f3c339fe3f66a1fb57ab77da1260f68fadcee7381bb5a2619482f0437c45d258"} Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.676670 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-mcrk5" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" containerID="cri-o://f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb" gracePeriod=15 Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.742316 4745 generic.go:334] "Generic (PLEG): container finished" podID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerID="61d56b75d870dbffbc49119906b223f4cefe1a836065ec7b38a3e4400948f8b9" exitCode=0 Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.742433 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" event={"ID":"51904623-2e7d-4a4d-a614-c916a8039fe9","Type":"ContainerDied","Data":"61d56b75d870dbffbc49119906b223f4cefe1a836065ec7b38a3e4400948f8b9"} Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.797734 4745 patch_prober.go:28] interesting pod/console-f9d7485db-mcrk5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 09 
11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.797779 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-mcrk5" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.993407 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mcrk5_8c51afdf-fde4-4147-814c-8befb1ad7d1f/console/0.log" Dec 09 11:46:01 crc kubenswrapper[4745]: I1209 11:46:01.993472 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163612 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163672 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dsqhp\" (UniqueName: \"kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163795 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.163864 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert\") pod \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\" (UID: \"8c51afdf-fde4-4147-814c-8befb1ad7d1f\") " Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.164781 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.165177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.165483 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config" (OuterVolumeSpecName: "console-config") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.166893 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.171714 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.171978 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp" (OuterVolumeSpecName: "kube-api-access-dsqhp") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "kube-api-access-dsqhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.172182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c51afdf-fde4-4147-814c-8befb1ad7d1f" (UID: "8c51afdf-fde4-4147-814c-8befb1ad7d1f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265316 4745 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265354 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265374 4745 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265385 4745 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265395 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsqhp\" (UniqueName: \"kubernetes.io/projected/8c51afdf-fde4-4147-814c-8befb1ad7d1f-kube-api-access-dsqhp\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265407 4745 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c51afdf-fde4-4147-814c-8befb1ad7d1f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.265417 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c51afdf-fde4-4147-814c-8befb1ad7d1f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751281 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mcrk5_8c51afdf-fde4-4147-814c-8befb1ad7d1f/console/0.log" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751335 4745 generic.go:334] "Generic (PLEG): container finished" podID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerID="f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb" exitCode=2 Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751368 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mcrk5" event={"ID":"8c51afdf-fde4-4147-814c-8befb1ad7d1f","Type":"ContainerDied","Data":"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb"} Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mcrk5" 
event={"ID":"8c51afdf-fde4-4147-814c-8befb1ad7d1f","Type":"ContainerDied","Data":"24d3d49c7a056eaf925cd7d7b7ae7acb5bd88131679859401a3faa6c7e8a50e1"} Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751416 4745 scope.go:117] "RemoveContainer" containerID="f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.751419 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mcrk5" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.770506 4745 scope.go:117] "RemoveContainer" containerID="f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb" Dec 09 11:46:02 crc kubenswrapper[4745]: E1209 11:46:02.771442 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb\": container with ID starting with f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb not found: ID does not exist" containerID="f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.771499 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb"} err="failed to get container status \"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb\": rpc error: code = NotFound desc = could not find container \"f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb\": container with ID starting with f72139f30c178972a682e877c9b5413f131fc035475f920ff9649dd60a51e8bb not found: ID does not exist" Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.785100 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:46:02 crc kubenswrapper[4745]: I1209 11:46:02.788684 4745 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-mcrk5"] Dec 09 11:46:03 crc kubenswrapper[4745]: I1209 11:46:03.562723 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" path="/var/lib/kubelet/pods/8c51afdf-fde4-4147-814c-8befb1ad7d1f/volumes" Dec 09 11:46:07 crc kubenswrapper[4745]: I1209 11:46:07.782533 4745 generic.go:334] "Generic (PLEG): container finished" podID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerID="0c6220429612e562cad19f10ef48804a7769088e3bdd2b25cfa9a149c22ca165" exitCode=0 Dec 09 11:46:07 crc kubenswrapper[4745]: I1209 11:46:07.782638 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" event={"ID":"51904623-2e7d-4a4d-a614-c916a8039fe9","Type":"ContainerDied","Data":"0c6220429612e562cad19f10ef48804a7769088e3bdd2b25cfa9a149c22ca165"} Dec 09 11:46:08 crc kubenswrapper[4745]: I1209 11:46:08.791739 4745 generic.go:334] "Generic (PLEG): container finished" podID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerID="8b2b7c4e92c211f60a6df856f150bd16f2e7f0c841320206726063063b4c7e1e" exitCode=0 Dec 09 11:46:08 crc kubenswrapper[4745]: I1209 11:46:08.792005 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" event={"ID":"51904623-2e7d-4a4d-a614-c916a8039fe9","Type":"ContainerDied","Data":"8b2b7c4e92c211f60a6df856f150bd16f2e7f0c841320206726063063b4c7e1e"} Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.010228 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.168600 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw5cs\" (UniqueName: \"kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs\") pod \"51904623-2e7d-4a4d-a614-c916a8039fe9\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.168726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle\") pod \"51904623-2e7d-4a4d-a614-c916a8039fe9\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.168783 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util\") pod \"51904623-2e7d-4a4d-a614-c916a8039fe9\" (UID: \"51904623-2e7d-4a4d-a614-c916a8039fe9\") " Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.170469 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle" (OuterVolumeSpecName: "bundle") pod "51904623-2e7d-4a4d-a614-c916a8039fe9" (UID: "51904623-2e7d-4a4d-a614-c916a8039fe9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.174238 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs" (OuterVolumeSpecName: "kube-api-access-kw5cs") pod "51904623-2e7d-4a4d-a614-c916a8039fe9" (UID: "51904623-2e7d-4a4d-a614-c916a8039fe9"). InnerVolumeSpecName "kube-api-access-kw5cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.179091 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util" (OuterVolumeSpecName: "util") pod "51904623-2e7d-4a4d-a614-c916a8039fe9" (UID: "51904623-2e7d-4a4d-a614-c916a8039fe9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.270250 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.270299 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51904623-2e7d-4a4d-a614-c916a8039fe9-util\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.270309 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw5cs\" (UniqueName: \"kubernetes.io/projected/51904623-2e7d-4a4d-a614-c916a8039fe9-kube-api-access-kw5cs\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.804609 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" event={"ID":"51904623-2e7d-4a4d-a614-c916a8039fe9","Type":"ContainerDied","Data":"f3c339fe3f66a1fb57ab77da1260f68fadcee7381bb5a2619482f0437c45d258"} Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.804650 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3c339fe3f66a1fb57ab77da1260f68fadcee7381bb5a2619482f0437c45d258" Dec 09 11:46:10 crc kubenswrapper[4745]: I1209 11:46:10.804688 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.950943 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4"] Dec 09 11:46:22 crc kubenswrapper[4745]: E1209 11:46:22.951751 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="pull" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951767 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="pull" Dec 09 11:46:22 crc kubenswrapper[4745]: E1209 11:46:22.951780 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="util" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951788 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="util" Dec 09 11:46:22 crc kubenswrapper[4745]: E1209 11:46:22.951799 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="extract" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951806 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="extract" Dec 09 11:46:22 crc kubenswrapper[4745]: E1209 11:46:22.951817 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951824 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" containerName="console" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951943 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c51afdf-fde4-4147-814c-8befb1ad7d1f" 
containerName="console" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.951961 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="51904623-2e7d-4a4d-a614-c916a8039fe9" containerName="extract" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.952454 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.954409 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.954583 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.954685 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.955136 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 09 11:46:22 crc kubenswrapper[4745]: I1209 11:46:22.957049 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-p5qvr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.003858 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4"] Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.049936 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-apiservice-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 
09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.050027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-webhook-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.050102 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmbfh\" (UniqueName: \"kubernetes.io/projected/b2f004c8-bb75-40ea-8377-fd965d8b8efa-kube-api-access-kmbfh\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.151060 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-apiservice-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.151157 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-webhook-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.151205 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmbfh\" (UniqueName: 
\"kubernetes.io/projected/b2f004c8-bb75-40ea-8377-fd965d8b8efa-kube-api-access-kmbfh\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.159442 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-apiservice-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.173201 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2f004c8-bb75-40ea-8377-fd965d8b8efa-webhook-cert\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.178169 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmbfh\" (UniqueName: \"kubernetes.io/projected/b2f004c8-bb75-40ea-8377-fd965d8b8efa-kube-api-access-kmbfh\") pod \"metallb-operator-controller-manager-58c594ff54-8s4t4\" (UID: \"b2f004c8-bb75-40ea-8377-fd965d8b8efa\") " pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.270280 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.327274 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr"] Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.327998 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.329869 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.330379 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-grqlt" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.331148 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.354435 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr"] Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.454278 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lg9\" (UniqueName: \"kubernetes.io/projected/0d69da0b-6b01-4d22-b874-b43c308d712e-kube-api-access-z5lg9\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.454725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-apiservice-cert\") pod 
\"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.454765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-webhook-cert\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.556156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-apiservice-cert\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.556230 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-webhook-cert\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.556289 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lg9\" (UniqueName: \"kubernetes.io/projected/0d69da0b-6b01-4d22-b874-b43c308d712e-kube-api-access-z5lg9\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.563070 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-webhook-cert\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.566064 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d69da0b-6b01-4d22-b874-b43c308d712e-apiservice-cert\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.587922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lg9\" (UniqueName: \"kubernetes.io/projected/0d69da0b-6b01-4d22-b874-b43c308d712e-kube-api-access-z5lg9\") pod \"metallb-operator-webhook-server-69fc6547dc-jm8rr\" (UID: \"0d69da0b-6b01-4d22-b874-b43c308d712e\") " pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.600901 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4"] Dec 09 11:46:23 crc kubenswrapper[4745]: W1209 11:46:23.608368 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f004c8_bb75_40ea_8377_fd965d8b8efa.slice/crio-e83cf1902000986702078268a76344d13565882d7c792637a68bd0ac141a5364 WatchSource:0}: Error finding container e83cf1902000986702078268a76344d13565882d7c792637a68bd0ac141a5364: Status 404 returned error can't find the container with id e83cf1902000986702078268a76344d13565882d7c792637a68bd0ac141a5364 Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 
11:46:23.643646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:23 crc kubenswrapper[4745]: I1209 11:46:23.877462 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" event={"ID":"b2f004c8-bb75-40ea-8377-fd965d8b8efa","Type":"ContainerStarted","Data":"e83cf1902000986702078268a76344d13565882d7c792637a68bd0ac141a5364"} Dec 09 11:46:24 crc kubenswrapper[4745]: I1209 11:46:24.113366 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr"] Dec 09 11:46:24 crc kubenswrapper[4745]: W1209 11:46:24.120689 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d69da0b_6b01_4d22_b874_b43c308d712e.slice/crio-02f1552d7865fb92b140f12d20f423888b989b685a8763ceadca9aae417c71fd WatchSource:0}: Error finding container 02f1552d7865fb92b140f12d20f423888b989b685a8763ceadca9aae417c71fd: Status 404 returned error can't find the container with id 02f1552d7865fb92b140f12d20f423888b989b685a8763ceadca9aae417c71fd Dec 09 11:46:24 crc kubenswrapper[4745]: I1209 11:46:24.884274 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" event={"ID":"0d69da0b-6b01-4d22-b874-b43c308d712e","Type":"ContainerStarted","Data":"02f1552d7865fb92b140f12d20f423888b989b685a8763ceadca9aae417c71fd"} Dec 09 11:46:27 crc kubenswrapper[4745]: I1209 11:46:27.903340 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" event={"ID":"b2f004c8-bb75-40ea-8377-fd965d8b8efa","Type":"ContainerStarted","Data":"5a753e288a7ae33cc338921756c79751fe1b982c2f1a163705b010d562ed7538"} Dec 09 11:46:27 crc kubenswrapper[4745]: I1209 11:46:27.903703 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:46:27 crc kubenswrapper[4745]: I1209 11:46:27.924044 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" podStartSLOduration=2.552461223 podStartE2EDuration="5.924023849s" podCreationTimestamp="2025-12-09 11:46:22 +0000 UTC" firstStartedPulling="2025-12-09 11:46:23.611128576 +0000 UTC m=+870.436330100" lastFinishedPulling="2025-12-09 11:46:26.982691202 +0000 UTC m=+873.807892726" observedRunningTime="2025-12-09 11:46:27.921898702 +0000 UTC m=+874.747100226" watchObservedRunningTime="2025-12-09 11:46:27.924023849 +0000 UTC m=+874.749225453" Dec 09 11:46:30 crc kubenswrapper[4745]: I1209 11:46:30.923907 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" event={"ID":"0d69da0b-6b01-4d22-b874-b43c308d712e","Type":"ContainerStarted","Data":"2a63679bb080b0b3eb8310efb05e1bc4dd19f4be872a917f689be34ff447b673"} Dec 09 11:46:30 crc kubenswrapper[4745]: I1209 11:46:30.924256 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:30 crc kubenswrapper[4745]: I1209 11:46:30.944228 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" podStartSLOduration=1.414765488 podStartE2EDuration="7.944205824s" podCreationTimestamp="2025-12-09 11:46:23 +0000 UTC" firstStartedPulling="2025-12-09 11:46:24.123106525 +0000 UTC m=+870.948308049" lastFinishedPulling="2025-12-09 11:46:30.652546861 +0000 UTC m=+877.477748385" observedRunningTime="2025-12-09 11:46:30.942329914 +0000 UTC m=+877.767531438" watchObservedRunningTime="2025-12-09 11:46:30.944205824 +0000 UTC m=+877.769407348" Dec 09 11:46:31 crc 
kubenswrapper[4745]: I1209 11:46:31.841643 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:31 crc kubenswrapper[4745]: I1209 11:46:31.843101 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:31 crc kubenswrapper[4745]: I1209 11:46:31.857250 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:31 crc kubenswrapper[4745]: I1209 11:46:31.923263 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn8n2\" (UniqueName: \"kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:31 crc kubenswrapper[4745]: I1209 11:46:31.923328 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:31 crc kubenswrapper[4745]: I1209 11:46:31.923362 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.024195 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn8n2\" (UniqueName: 
\"kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.024827 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.024921 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.025444 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.025591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.047666 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn8n2\" (UniqueName: 
\"kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2\") pod \"community-operators-nfr7d\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.157710 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.879799 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:32 crc kubenswrapper[4745]: I1209 11:46:32.948659 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerStarted","Data":"96d78405c5658006ed5e75f0f70cb8ff1777c9174e78ac9c6480e981d0790d95"} Dec 09 11:46:34 crc kubenswrapper[4745]: E1209 11:46:34.545067 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f93d682_e8b1_49c9_8971_a95c65e6748a.slice/crio-conmon-8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979.scope\": RecentStats: unable to find data in memory cache]" Dec 09 11:46:34 crc kubenswrapper[4745]: I1209 11:46:34.962688 4745 generic.go:334] "Generic (PLEG): container finished" podID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerID="8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979" exitCode=0 Dec 09 11:46:34 crc kubenswrapper[4745]: I1209 11:46:34.962735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerDied","Data":"8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979"} Dec 09 11:46:35 crc kubenswrapper[4745]: I1209 11:46:35.971256 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerStarted","Data":"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb"} Dec 09 11:46:36 crc kubenswrapper[4745]: I1209 11:46:36.977777 4745 generic.go:334] "Generic (PLEG): container finished" podID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerID="10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb" exitCode=0 Dec 09 11:46:36 crc kubenswrapper[4745]: I1209 11:46:36.977830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerDied","Data":"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb"} Dec 09 11:46:41 crc kubenswrapper[4745]: I1209 11:46:41.001256 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerStarted","Data":"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e"} Dec 09 11:46:41 crc kubenswrapper[4745]: I1209 11:46:41.022689 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nfr7d" podStartSLOduration=4.707088977 podStartE2EDuration="10.022661617s" podCreationTimestamp="2025-12-09 11:46:31 +0000 UTC" firstStartedPulling="2025-12-09 11:46:34.964227999 +0000 UTC m=+881.789429523" lastFinishedPulling="2025-12-09 11:46:40.279800639 +0000 UTC m=+887.105002163" observedRunningTime="2025-12-09 11:46:41.016433829 +0000 UTC m=+887.841635353" watchObservedRunningTime="2025-12-09 11:46:41.022661617 +0000 UTC m=+887.847863161" Dec 09 11:46:42 crc kubenswrapper[4745]: I1209 11:46:42.158422 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:42 crc 
kubenswrapper[4745]: I1209 11:46:42.158480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.195809 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nfr7d" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="registry-server" probeResult="failure" output=< Dec 09 11:46:43 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Dec 09 11:46:43 crc kubenswrapper[4745]: > Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.224696 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.226115 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.235800 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.285683 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.285778 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.285827 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4gc4\" (UniqueName: \"kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.386744 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.386824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.386857 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4gc4\" (UniqueName: \"kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.387399 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.387468 
4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.410168 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4gc4\" (UniqueName: \"kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4\") pod \"redhat-marketplace-4xbtd\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.547159 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.676716 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69fc6547dc-jm8rr" Dec 09 11:46:43 crc kubenswrapper[4745]: I1209 11:46:43.902364 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:46:43 crc kubenswrapper[4745]: W1209 11:46:43.928141 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0acc446b_fe2e_40dd_8e91_9c62addbfe66.slice/crio-92449c0c28f7ba7a3404d1eb67ce7b59379a4d78bdcb9047748705504e8abeba WatchSource:0}: Error finding container 92449c0c28f7ba7a3404d1eb67ce7b59379a4d78bdcb9047748705504e8abeba: Status 404 returned error can't find the container with id 92449c0c28f7ba7a3404d1eb67ce7b59379a4d78bdcb9047748705504e8abeba Dec 09 11:46:44 crc kubenswrapper[4745]: I1209 11:46:44.021395 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" 
event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerStarted","Data":"92449c0c28f7ba7a3404d1eb67ce7b59379a4d78bdcb9047748705504e8abeba"} Dec 09 11:46:47 crc kubenswrapper[4745]: I1209 11:46:47.040983 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerStarted","Data":"2764323453d976b5ea2f52807d8ae34e8b5b926a7cfa9f828586fb0a3a77c80a"} Dec 09 11:46:48 crc kubenswrapper[4745]: I1209 11:46:48.050850 4745 generic.go:334] "Generic (PLEG): container finished" podID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerID="2764323453d976b5ea2f52807d8ae34e8b5b926a7cfa9f828586fb0a3a77c80a" exitCode=0 Dec 09 11:46:48 crc kubenswrapper[4745]: I1209 11:46:48.050919 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerDied","Data":"2764323453d976b5ea2f52807d8ae34e8b5b926a7cfa9f828586fb0a3a77c80a"} Dec 09 11:46:50 crc kubenswrapper[4745]: I1209 11:46:50.066169 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerStarted","Data":"27a546cb207ab9af57757fbd389e837d957d709942d9a856926f68b56dac86ca"} Dec 09 11:46:51 crc kubenswrapper[4745]: I1209 11:46:51.073296 4745 generic.go:334] "Generic (PLEG): container finished" podID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerID="27a546cb207ab9af57757fbd389e837d957d709942d9a856926f68b56dac86ca" exitCode=0 Dec 09 11:46:51 crc kubenswrapper[4745]: I1209 11:46:51.073344 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerDied","Data":"27a546cb207ab9af57757fbd389e837d957d709942d9a856926f68b56dac86ca"} Dec 09 11:46:52 crc kubenswrapper[4745]: I1209 
11:46:52.083612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerStarted","Data":"b856235c0711cd6c88d6d213defc9145701599117d2cfd92ec40863d9b767423"} Dec 09 11:46:52 crc kubenswrapper[4745]: I1209 11:46:52.102573 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xbtd" podStartSLOduration=5.533707241 podStartE2EDuration="9.102542543s" podCreationTimestamp="2025-12-09 11:46:43 +0000 UTC" firstStartedPulling="2025-12-09 11:46:48.053192509 +0000 UTC m=+894.878394033" lastFinishedPulling="2025-12-09 11:46:51.622027811 +0000 UTC m=+898.447229335" observedRunningTime="2025-12-09 11:46:52.100043366 +0000 UTC m=+898.925244900" watchObservedRunningTime="2025-12-09 11:46:52.102542543 +0000 UTC m=+898.927744067" Dec 09 11:46:52 crc kubenswrapper[4745]: I1209 11:46:52.215797 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:52 crc kubenswrapper[4745]: I1209 11:46:52.307440 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.426585 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.427965 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.439689 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.439756 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.439903 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82fh4\" (UniqueName: \"kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.440000 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.540828 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.541052 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.541122 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82fh4\" (UniqueName: \"kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.541541 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.541598 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.548101 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.548152 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.567721 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-82fh4\" (UniqueName: \"kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4\") pod \"certified-operators-rgkpt\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.607610 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:46:53 crc kubenswrapper[4745]: I1209 11:46:53.745594 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:46:54 crc kubenswrapper[4745]: I1209 11:46:54.344784 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:46:54 crc kubenswrapper[4745]: W1209 11:46:54.347804 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04a1d96_420e_4d4f_8c7f_bce46b811660.slice/crio-cbf26644eade3279d33395a25f9bb79b2a61797cb195777647c6da8e860e20cf WatchSource:0}: Error finding container cbf26644eade3279d33395a25f9bb79b2a61797cb195777647c6da8e860e20cf: Status 404 returned error can't find the container with id cbf26644eade3279d33395a25f9bb79b2a61797cb195777647c6da8e860e20cf Dec 09 11:46:55 crc kubenswrapper[4745]: I1209 11:46:55.115029 4745 generic.go:334] "Generic (PLEG): container finished" podID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerID="6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17" exitCode=0 Dec 09 11:46:55 crc kubenswrapper[4745]: I1209 11:46:55.115141 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerDied","Data":"6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17"} Dec 09 11:46:55 crc kubenswrapper[4745]: I1209 11:46:55.115416 
4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerStarted","Data":"cbf26644eade3279d33395a25f9bb79b2a61797cb195777647c6da8e860e20cf"} Dec 09 11:46:55 crc kubenswrapper[4745]: I1209 11:46:55.475304 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:46:55 crc kubenswrapper[4745]: I1209 11:46:55.475366 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:46:56 crc kubenswrapper[4745]: I1209 11:46:56.617134 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:56 crc kubenswrapper[4745]: I1209 11:46:56.617436 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nfr7d" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="registry-server" containerID="cri-o://6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e" gracePeriod=2 Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.543896 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.643409 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn8n2\" (UniqueName: \"kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2\") pod \"9f93d682-e8b1-49c9-8971-a95c65e6748a\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.643500 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content\") pod \"9f93d682-e8b1-49c9-8971-a95c65e6748a\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.643610 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities\") pod \"9f93d682-e8b1-49c9-8971-a95c65e6748a\" (UID: \"9f93d682-e8b1-49c9-8971-a95c65e6748a\") " Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.644748 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities" (OuterVolumeSpecName: "utilities") pod "9f93d682-e8b1-49c9-8971-a95c65e6748a" (UID: "9f93d682-e8b1-49c9-8971-a95c65e6748a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.653679 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2" (OuterVolumeSpecName: "kube-api-access-dn8n2") pod "9f93d682-e8b1-49c9-8971-a95c65e6748a" (UID: "9f93d682-e8b1-49c9-8971-a95c65e6748a"). InnerVolumeSpecName "kube-api-access-dn8n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.701709 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f93d682-e8b1-49c9-8971-a95c65e6748a" (UID: "9f93d682-e8b1-49c9-8971-a95c65e6748a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.744756 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn8n2\" (UniqueName: \"kubernetes.io/projected/9f93d682-e8b1-49c9-8971-a95c65e6748a-kube-api-access-dn8n2\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.744798 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:57 crc kubenswrapper[4745]: I1209 11:46:57.744811 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f93d682-e8b1-49c9-8971-a95c65e6748a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.135498 4745 generic.go:334] "Generic (PLEG): container finished" podID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerID="6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e" exitCode=0 Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.135593 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerDied","Data":"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e"} Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.135622 4745 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nfr7d" event={"ID":"9f93d682-e8b1-49c9-8971-a95c65e6748a","Type":"ContainerDied","Data":"96d78405c5658006ed5e75f0f70cb8ff1777c9174e78ac9c6480e981d0790d95"} Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.135640 4745 scope.go:117] "RemoveContainer" containerID="6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.135766 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfr7d" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.140848 4745 generic.go:334] "Generic (PLEG): container finished" podID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerID="1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9" exitCode=0 Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.140889 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerDied","Data":"1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9"} Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.152805 4745 scope.go:117] "RemoveContainer" containerID="10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.179796 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.182892 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nfr7d"] Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.185308 4745 scope.go:117] "RemoveContainer" containerID="8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.200455 4745 scope.go:117] "RemoveContainer" 
containerID="6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e" Dec 09 11:46:58 crc kubenswrapper[4745]: E1209 11:46:58.201059 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e\": container with ID starting with 6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e not found: ID does not exist" containerID="6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.201103 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e"} err="failed to get container status \"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e\": rpc error: code = NotFound desc = could not find container \"6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e\": container with ID starting with 6ca413415d136b97bb3fcaf088ded8ae0c0f22d04334210b773cf1eee137f44e not found: ID does not exist" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.201133 4745 scope.go:117] "RemoveContainer" containerID="10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb" Dec 09 11:46:58 crc kubenswrapper[4745]: E1209 11:46:58.201395 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb\": container with ID starting with 10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb not found: ID does not exist" containerID="10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.201419 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb"} err="failed to get container status \"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb\": rpc error: code = NotFound desc = could not find container \"10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb\": container with ID starting with 10f7e309489c665332a83a4e622839a7a358cd524e5aab35905b8e2b2ce37cfb not found: ID does not exist" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.201434 4745 scope.go:117] "RemoveContainer" containerID="8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979" Dec 09 11:46:58 crc kubenswrapper[4745]: E1209 11:46:58.201640 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979\": container with ID starting with 8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979 not found: ID does not exist" containerID="8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979" Dec 09 11:46:58 crc kubenswrapper[4745]: I1209 11:46:58.201662 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979"} err="failed to get container status \"8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979\": rpc error: code = NotFound desc = could not find container \"8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979\": container with ID starting with 8e1e56f40927395518b94d354e9d0038b6ba04b46166dabd9702b3ab108d1979 not found: ID does not exist" Dec 09 11:46:59 crc kubenswrapper[4745]: I1209 11:46:59.562863 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" path="/var/lib/kubelet/pods/9f93d682-e8b1-49c9-8971-a95c65e6748a/volumes" Dec 09 11:47:00 crc kubenswrapper[4745]: I1209 
11:47:00.158683 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerStarted","Data":"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764"} Dec 09 11:47:00 crc kubenswrapper[4745]: I1209 11:47:00.176188 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rgkpt" podStartSLOduration=3.464217678 podStartE2EDuration="7.176171849s" podCreationTimestamp="2025-12-09 11:46:53 +0000 UTC" firstStartedPulling="2025-12-09 11:46:55.116478462 +0000 UTC m=+901.941679986" lastFinishedPulling="2025-12-09 11:46:58.828432633 +0000 UTC m=+905.653634157" observedRunningTime="2025-12-09 11:47:00.174816812 +0000 UTC m=+907.000018346" watchObservedRunningTime="2025-12-09 11:47:00.176171849 +0000 UTC m=+907.001373393" Dec 09 11:47:03 crc kubenswrapper[4745]: I1209 11:47:03.275263 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58c594ff54-8s4t4" Dec 09 11:47:03 crc kubenswrapper[4745]: I1209 11:47:03.590323 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:47:03 crc kubenswrapper[4745]: I1209 11:47:03.745754 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:03 crc kubenswrapper[4745]: I1209 11:47:03.746095 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:03 crc kubenswrapper[4745]: I1209 11:47:03.784333 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.021097 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/frr-k8s-2j5gd"] Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.021611 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="registry-server" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.021726 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="registry-server" Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.021805 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="extract-utilities" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.021865 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="extract-utilities" Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.021952 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="extract-content" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.022011 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="extract-content" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.022203 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f93d682-e8b1-49c9-8971-a95c65e6748a" containerName="registry-server" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.024729 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.027261 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.027425 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cx89b" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.027693 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.034236 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.035142 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.037588 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.072248 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.129749 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-sockets\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130213 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics\") pod \"frr-k8s-2j5gd\" (UID: 
\"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130338 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sd4p\" (UniqueName: \"kubernetes.io/projected/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-kube-api-access-2sd4p\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130661 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-conf\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130806 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-startup\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.130904 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-reloader\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc 
kubenswrapper[4745]: I1209 11:47:04.153916 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wkt54"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.155237 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.158319 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.159025 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-bj6cc"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.160063 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.161961 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jjsg8" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.162227 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.162895 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.166551 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.179324 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-bj6cc"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsw9\" (UniqueName: \"kubernetes.io/projected/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-kube-api-access-sbsw9\") pod 
\"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-sockets\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236331 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sd4p\" (UniqueName: \"kubernetes.io/projected/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-kube-api-access-2sd4p\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236477 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236497 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " 
pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236540 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-conf\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236571 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-startup\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.236586 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-reloader\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.237017 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-reloader\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.237565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-sockets\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.237749 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.238064 4745 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.238107 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs podName:3144339c-91b6-4d03-ae8f-d7ba80ba67ae nodeName:}" failed. No retries permitted until 2025-12-09 11:47:04.738094335 +0000 UTC m=+911.563295859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs") pod "frr-k8s-2j5gd" (UID: "3144339c-91b6-4d03-ae8f-d7ba80ba67ae") : secret "frr-k8s-certs-secret" not found Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.238391 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-conf\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.239186 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-frr-startup\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.253693 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.274624 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2sd4p\" (UniqueName: \"kubernetes.io/projected/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-kube-api-access-2sd4p\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.338699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-metrics-certs\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.338798 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metrics-certs\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.338868 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-cert\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339496 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfmj\" (UniqueName: \"kubernetes.io/projected/54802f9b-3130-4b2c-a22c-fdbc95388e66-kube-api-access-sbfmj\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339559 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metallb-excludel2\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339587 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsw9\" (UniqueName: \"kubernetes.io/projected/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-kube-api-access-sbsw9\") pod \"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.339648 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnhb\" (UniqueName: \"kubernetes.io/projected/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-kube-api-access-2dnhb\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.344260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.358227 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsw9\" (UniqueName: \"kubernetes.io/projected/c6e40c28-b12b-4e80-80ab-a0d5cf254c9c-kube-api-access-sbsw9\") pod \"frr-k8s-webhook-server-7fcb986d4-fsg2c\" (UID: \"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.358852 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.443943 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metallb-excludel2\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444044 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnhb\" (UniqueName: \"kubernetes.io/projected/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-kube-api-access-2dnhb\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444097 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-metrics-certs\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc 
kubenswrapper[4745]: I1209 11:47:04.444158 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metrics-certs\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444188 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444256 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-cert\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444290 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfmj\" (UniqueName: \"kubernetes.io/projected/54802f9b-3130-4b2c-a22c-fdbc95388e66-kube-api-access-sbfmj\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.444575 4745 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.444659 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist podName:9ea06c36-f5da-472b-9ad1-5ab4401e89e2 nodeName:}" failed. 
No retries permitted until 2025-12-09 11:47:04.944637547 +0000 UTC m=+911.769839071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist") pod "speaker-wkt54" (UID: "9ea06c36-f5da-472b-9ad1-5ab4401e89e2") : secret "metallb-memberlist" not found Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.444988 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metallb-excludel2\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.447863 4745 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.448125 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-metrics-certs\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.448975 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-metrics-certs\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.458825 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54802f9b-3130-4b2c-a22c-fdbc95388e66-cert\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc 
kubenswrapper[4745]: I1209 11:47:04.460719 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfmj\" (UniqueName: \"kubernetes.io/projected/54802f9b-3130-4b2c-a22c-fdbc95388e66-kube-api-access-sbfmj\") pod \"controller-f8648f98b-bj6cc\" (UID: \"54802f9b-3130-4b2c-a22c-fdbc95388e66\") " pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.467412 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnhb\" (UniqueName: \"kubernetes.io/projected/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-kube-api-access-2dnhb\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.491771 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.747487 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.759130 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3144339c-91b6-4d03-ae8f-d7ba80ba67ae-metrics-certs\") pod \"frr-k8s-2j5gd\" (UID: \"3144339c-91b6-4d03-ae8f-d7ba80ba67ae\") " pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.902669 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c"] Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.948996 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:04 crc kubenswrapper[4745]: I1209 11:47:04.950649 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.950863 4745 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 11:47:04 crc kubenswrapper[4745]: E1209 11:47:04.950961 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist podName:9ea06c36-f5da-472b-9ad1-5ab4401e89e2 nodeName:}" failed. No retries permitted until 2025-12-09 11:47:05.95094127 +0000 UTC m=+912.776142794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist") pod "speaker-wkt54" (UID: "9ea06c36-f5da-472b-9ad1-5ab4401e89e2") : secret "metallb-memberlist" not found Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.013963 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-bj6cc"] Dec 09 11:47:05 crc kubenswrapper[4745]: W1209 11:47:05.017635 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54802f9b_3130_4b2c_a22c_fdbc95388e66.slice/crio-a3afab2820d097d78e87b2d6561f307eca9b7a11db73eeafec311356109260b3 WatchSource:0}: Error finding container a3afab2820d097d78e87b2d6561f307eca9b7a11db73eeafec311356109260b3: Status 404 returned error can't find the container with id a3afab2820d097d78e87b2d6561f307eca9b7a11db73eeafec311356109260b3 Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.186548 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bj6cc" event={"ID":"54802f9b-3130-4b2c-a22c-fdbc95388e66","Type":"ContainerStarted","Data":"a3afab2820d097d78e87b2d6561f307eca9b7a11db73eeafec311356109260b3"} Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.187954 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" event={"ID":"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c","Type":"ContainerStarted","Data":"8586ff2a66977c8cb962a18a3845f4aa6f0771b0dd55d00d12a53a75fe739638"} Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.963759 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.969225 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9ea06c36-f5da-472b-9ad1-5ab4401e89e2-memberlist\") pod \"speaker-wkt54\" (UID: \"9ea06c36-f5da-472b-9ad1-5ab4401e89e2\") " pod="metallb-system/speaker-wkt54" Dec 09 11:47:05 crc kubenswrapper[4745]: I1209 11:47:05.979495 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wkt54" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.198938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"97ae3d2a88b1bcb7c529b8f1f408a1ff094fb2afa3e641040523c6d60d24168d"} Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.212549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wkt54" event={"ID":"9ea06c36-f5da-472b-9ad1-5ab4401e89e2","Type":"ContainerStarted","Data":"e6a8903864cf11c60405ba8a22024f5ee6523440e21ac0fe69c327d0a939e57c"} Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.217840 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.224482 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rgkpt" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="registry-server" containerID="cri-o://b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764" gracePeriod=2 Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.225974 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bj6cc" event={"ID":"54802f9b-3130-4b2c-a22c-fdbc95388e66","Type":"ContainerStarted","Data":"739fb0c0be2fc9aa364413f51f9366925077d81418ce081c6945ceee60e83b41"} Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.226028 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bj6cc" event={"ID":"54802f9b-3130-4b2c-a22c-fdbc95388e66","Type":"ContainerStarted","Data":"4d8dc2736b5b8ecbab11cee2eebdcc608b4908f570bc8bfc2c154ef505806c8d"} Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.226049 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.260208 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-bj6cc" podStartSLOduration=2.260175092 podStartE2EDuration="2.260175092s" podCreationTimestamp="2025-12-09 11:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:47:06.252127656 +0000 UTC m=+913.077329190" watchObservedRunningTime="2025-12-09 11:47:06.260175092 +0000 UTC m=+913.085376616" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.718799 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.896341 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities\") pod \"e04a1d96-420e-4d4f-8c7f-bce46b811660\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.896399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content\") pod \"e04a1d96-420e-4d4f-8c7f-bce46b811660\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.896419 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82fh4\" (UniqueName: \"kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4\") pod \"e04a1d96-420e-4d4f-8c7f-bce46b811660\" (UID: \"e04a1d96-420e-4d4f-8c7f-bce46b811660\") " Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.897585 4745 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities" (OuterVolumeSpecName: "utilities") pod "e04a1d96-420e-4d4f-8c7f-bce46b811660" (UID: "e04a1d96-420e-4d4f-8c7f-bce46b811660"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.902786 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4" (OuterVolumeSpecName: "kube-api-access-82fh4") pod "e04a1d96-420e-4d4f-8c7f-bce46b811660" (UID: "e04a1d96-420e-4d4f-8c7f-bce46b811660"). InnerVolumeSpecName "kube-api-access-82fh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.959928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e04a1d96-420e-4d4f-8c7f-bce46b811660" (UID: "e04a1d96-420e-4d4f-8c7f-bce46b811660"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.997902 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.998205 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a1d96-420e-4d4f-8c7f-bce46b811660-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:06 crc kubenswrapper[4745]: I1209 11:47:06.998218 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82fh4\" (UniqueName: \"kubernetes.io/projected/e04a1d96-420e-4d4f-8c7f-bce46b811660-kube-api-access-82fh4\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.258954 4745 generic.go:334] "Generic (PLEG): container finished" podID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerID="b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764" exitCode=0 Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.259050 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerDied","Data":"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764"} Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.259085 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgkpt" event={"ID":"e04a1d96-420e-4d4f-8c7f-bce46b811660","Type":"ContainerDied","Data":"cbf26644eade3279d33395a25f9bb79b2a61797cb195777647c6da8e860e20cf"} Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.259107 4745 scope.go:117] "RemoveContainer" containerID="b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 
11:47:07.259278 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rgkpt" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.296003 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wkt54" event={"ID":"9ea06c36-f5da-472b-9ad1-5ab4401e89e2","Type":"ContainerStarted","Data":"2231e6fa67bc62a82c53be768b57eed34cb93f63ae693008d4cc987a332850be"} Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.296084 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wkt54" event={"ID":"9ea06c36-f5da-472b-9ad1-5ab4401e89e2","Type":"ContainerStarted","Data":"dafa4c8fd440df9bc69d305b0bbe7371be4c51d65b580e92b40fa1c2a08ece5a"} Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.296152 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wkt54" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.330057 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wkt54" podStartSLOduration=3.33003969 podStartE2EDuration="3.33003969s" podCreationTimestamp="2025-12-09 11:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:47:07.325783826 +0000 UTC m=+914.150985350" watchObservedRunningTime="2025-12-09 11:47:07.33003969 +0000 UTC m=+914.155241214" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.330712 4745 scope.go:117] "RemoveContainer" containerID="1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.357128 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.361530 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-rgkpt"] Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.398308 4745 scope.go:117] "RemoveContainer" containerID="6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.442400 4745 scope.go:117] "RemoveContainer" containerID="b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764" Dec 09 11:47:07 crc kubenswrapper[4745]: E1209 11:47:07.447169 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764\": container with ID starting with b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764 not found: ID does not exist" containerID="b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.447227 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764"} err="failed to get container status \"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764\": rpc error: code = NotFound desc = could not find container \"b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764\": container with ID starting with b7ecbc8be57cfaad48c9691a061a57a6827d8be1a4f04dbf3267a85a71d58764 not found: ID does not exist" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.447306 4745 scope.go:117] "RemoveContainer" containerID="1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9" Dec 09 11:47:07 crc kubenswrapper[4745]: E1209 11:47:07.448864 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9\": container with ID starting with 
1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9 not found: ID does not exist" containerID="1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.448892 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9"} err="failed to get container status \"1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9\": rpc error: code = NotFound desc = could not find container \"1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9\": container with ID starting with 1e25690a6e6c94db903234bbf171492d71b2debebb6b405f4d6b3ba7d7dc4da9 not found: ID does not exist" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.448912 4745 scope.go:117] "RemoveContainer" containerID="6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17" Dec 09 11:47:07 crc kubenswrapper[4745]: E1209 11:47:07.449302 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17\": container with ID starting with 6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17 not found: ID does not exist" containerID="6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.449354 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17"} err="failed to get container status \"6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17\": rpc error: code = NotFound desc = could not find container \"6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17\": container with ID starting with 6f7c5a882ef46326d84645be3a19b05875e758e0a849fdddbddd7ebb1070ec17 not found: ID does not 
exist" Dec 09 11:47:07 crc kubenswrapper[4745]: I1209 11:47:07.568049 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" path="/var/lib/kubelet/pods/e04a1d96-420e-4d4f-8c7f-bce46b811660/volumes" Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.016648 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.016967 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xbtd" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="registry-server" containerID="cri-o://b856235c0711cd6c88d6d213defc9145701599117d2cfd92ec40863d9b767423" gracePeriod=2 Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.313358 4745 generic.go:334] "Generic (PLEG): container finished" podID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerID="b856235c0711cd6c88d6d213defc9145701599117d2cfd92ec40863d9b767423" exitCode=0 Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.313660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerDied","Data":"b856235c0711cd6c88d6d213defc9145701599117d2cfd92ec40863d9b767423"} Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.777696 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.945874 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content\") pod \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.946010 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities\") pod \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.946162 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4gc4\" (UniqueName: \"kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4\") pod \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\" (UID: \"0acc446b-fe2e-40dd-8e91-9c62addbfe66\") " Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.948213 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities" (OuterVolumeSpecName: "utilities") pod "0acc446b-fe2e-40dd-8e91-9c62addbfe66" (UID: "0acc446b-fe2e-40dd-8e91-9c62addbfe66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:09 crc kubenswrapper[4745]: I1209 11:47:09.969275 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0acc446b-fe2e-40dd-8e91-9c62addbfe66" (UID: "0acc446b-fe2e-40dd-8e91-9c62addbfe66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.043214 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4" (OuterVolumeSpecName: "kube-api-access-q4gc4") pod "0acc446b-fe2e-40dd-8e91-9c62addbfe66" (UID: "0acc446b-fe2e-40dd-8e91-9c62addbfe66"). InnerVolumeSpecName "kube-api-access-q4gc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.050637 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.050704 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4gc4\" (UniqueName: \"kubernetes.io/projected/0acc446b-fe2e-40dd-8e91-9c62addbfe66-kube-api-access-q4gc4\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.050719 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0acc446b-fe2e-40dd-8e91-9c62addbfe66-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.320822 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbtd" event={"ID":"0acc446b-fe2e-40dd-8e91-9c62addbfe66","Type":"ContainerDied","Data":"92449c0c28f7ba7a3404d1eb67ce7b59379a4d78bdcb9047748705504e8abeba"} Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.320928 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbtd" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.321130 4745 scope.go:117] "RemoveContainer" containerID="b856235c0711cd6c88d6d213defc9145701599117d2cfd92ec40863d9b767423" Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.351855 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:47:10 crc kubenswrapper[4745]: I1209 11:47:10.354241 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbtd"] Dec 09 11:47:11 crc kubenswrapper[4745]: I1209 11:47:11.568469 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" path="/var/lib/kubelet/pods/0acc446b-fe2e-40dd-8e91-9c62addbfe66/volumes" Dec 09 11:47:15 crc kubenswrapper[4745]: I1209 11:47:15.323711 4745 scope.go:117] "RemoveContainer" containerID="27a546cb207ab9af57757fbd389e837d957d709942d9a856926f68b56dac86ca" Dec 09 11:47:15 crc kubenswrapper[4745]: I1209 11:47:15.353922 4745 scope.go:117] "RemoveContainer" containerID="2764323453d976b5ea2f52807d8ae34e8b5b926a7cfa9f828586fb0a3a77c80a" Dec 09 11:47:16 crc kubenswrapper[4745]: I1209 11:47:16.511931 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" event={"ID":"c6e40c28-b12b-4e80-80ab-a0d5cf254c9c","Type":"ContainerStarted","Data":"b339c6dddb1cb3701695c1edf5a962d41d43c8f2b70cd7ee76bf32f5e95640f6"} Dec 09 11:47:16 crc kubenswrapper[4745]: I1209 11:47:16.512550 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:16 crc kubenswrapper[4745]: I1209 11:47:16.513729 4745 generic.go:334] "Generic (PLEG): container finished" podID="3144339c-91b6-4d03-ae8f-d7ba80ba67ae" containerID="7e74da95af44b8f5d51500a26de968184131c0e2efbae42732a53920d2753907" exitCode=0 Dec 09 
11:47:16 crc kubenswrapper[4745]: I1209 11:47:16.513774 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerDied","Data":"7e74da95af44b8f5d51500a26de968184131c0e2efbae42732a53920d2753907"} Dec 09 11:47:16 crc kubenswrapper[4745]: I1209 11:47:16.540105 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" podStartSLOduration=1.616452129 podStartE2EDuration="12.540074654s" podCreationTimestamp="2025-12-09 11:47:04 +0000 UTC" firstStartedPulling="2025-12-09 11:47:04.921704072 +0000 UTC m=+911.746905596" lastFinishedPulling="2025-12-09 11:47:15.845326597 +0000 UTC m=+922.670528121" observedRunningTime="2025-12-09 11:47:16.531706369 +0000 UTC m=+923.356907913" watchObservedRunningTime="2025-12-09 11:47:16.540074654 +0000 UTC m=+923.365276178" Dec 09 11:47:17 crc kubenswrapper[4745]: I1209 11:47:17.520841 4745 generic.go:334] "Generic (PLEG): container finished" podID="3144339c-91b6-4d03-ae8f-d7ba80ba67ae" containerID="54ad3711b20a2577a6935cf5fb1839dfb3377f2d8f6d994ea226f34095b55c11" exitCode=0 Dec 09 11:47:17 crc kubenswrapper[4745]: I1209 11:47:17.520950 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerDied","Data":"54ad3711b20a2577a6935cf5fb1839dfb3377f2d8f6d994ea226f34095b55c11"} Dec 09 11:47:18 crc kubenswrapper[4745]: I1209 11:47:18.528021 4745 generic.go:334] "Generic (PLEG): container finished" podID="3144339c-91b6-4d03-ae8f-d7ba80ba67ae" containerID="16c93388cf32b66363d8f978d159c8763feefa3c2eb2339b12094ccc08cbfca3" exitCode=0 Dec 09 11:47:18 crc kubenswrapper[4745]: I1209 11:47:18.528065 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" 
event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerDied","Data":"16c93388cf32b66363d8f978d159c8763feefa3c2eb2339b12094ccc08cbfca3"} Dec 09 11:47:19 crc kubenswrapper[4745]: I1209 11:47:19.537282 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"674537ea7c7e0bd4f5b535cbd417dda86772e3d87ab1c4800972446527813b77"} Dec 09 11:47:22 crc kubenswrapper[4745]: I1209 11:47:22.563666 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"29434644c83307e1f176e67de2702397f1d2521f6dd674d5fba272b0fbb2e0b4"} Dec 09 11:47:22 crc kubenswrapper[4745]: I1209 11:47:22.564184 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"7879154bb85006758333d4404a7b0a3eb432e3058ef09a735d9df4ae4ae8f618"} Dec 09 11:47:22 crc kubenswrapper[4745]: I1209 11:47:22.564203 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"84bd9013da1dddb87a9f667d95ddd7a41f718792bcfc3434635f4e2f3a1fd29b"} Dec 09 11:47:23 crc kubenswrapper[4745]: I1209 11:47:23.574097 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"6a8cc191e6472f325b50de7c4dde6412e1d725064325b341acc91b4a0654cb35"} Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.496534 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-bj6cc" Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.586582 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-2j5gd" event={"ID":"3144339c-91b6-4d03-ae8f-d7ba80ba67ae","Type":"ContainerStarted","Data":"a45a254c8fa9092a84dacffb15846f2a3d42c23bd40024b83ec7a849ed7a836b"} Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.586773 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.798067 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2j5gd" podStartSLOduration=11.366544039 podStartE2EDuration="21.798050373s" podCreationTimestamp="2025-12-09 11:47:03 +0000 UTC" firstStartedPulling="2025-12-09 11:47:05.396505367 +0000 UTC m=+912.221706891" lastFinishedPulling="2025-12-09 11:47:15.828011701 +0000 UTC m=+922.653213225" observedRunningTime="2025-12-09 11:47:24.796894682 +0000 UTC m=+931.622096216" watchObservedRunningTime="2025-12-09 11:47:24.798050373 +0000 UTC m=+931.623251897" Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.949449 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:24 crc kubenswrapper[4745]: I1209 11:47:24.998244 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:25 crc kubenswrapper[4745]: I1209 11:47:25.475833 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:47:25 crc kubenswrapper[4745]: I1209 11:47:25.475960 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:47:25 crc kubenswrapper[4745]: I1209 11:47:25.992576 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wkt54" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396006 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4"] Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396246 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="extract-content" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396258 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="extract-content" Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396268 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="extract-content" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396274 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="extract-content" Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396286 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396292 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396299 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="extract-utilities" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396304 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="extract-utilities" Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396315 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="extract-utilities" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396321 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="extract-utilities" Dec 09 11:47:27 crc kubenswrapper[4745]: E1209 11:47:27.396333 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396338 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396429 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0acc446b-fe2e-40dd-8e91-9c62addbfe66" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.396440 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04a1d96-420e-4d4f-8c7f-bce46b811660" containerName="registry-server" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.397209 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.400567 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.411780 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4"] Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.459289 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.459368 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvp7\" (UniqueName: \"kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.459401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: 
I1209 11:47:27.560291 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.560360 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvp7\" (UniqueName: \"kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.560386 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.561006 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.561084 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.580627 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvp7\" (UniqueName: \"kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:27 crc kubenswrapper[4745]: I1209 11:47:27.715582 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:28 crc kubenswrapper[4745]: I1209 11:47:28.306027 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4"] Dec 09 11:47:28 crc kubenswrapper[4745]: W1209 11:47:28.312464 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae7d9487_15ef_4298_914c_229ff735554f.slice/crio-8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba WatchSource:0}: Error finding container 8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba: Status 404 returned error can't find the container with id 8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba Dec 09 11:47:28 crc kubenswrapper[4745]: I1209 11:47:28.621351 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" 
event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerStarted","Data":"8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba"} Dec 09 11:47:29 crc kubenswrapper[4745]: I1209 11:47:29.629969 4745 generic.go:334] "Generic (PLEG): container finished" podID="ae7d9487-15ef-4298-914c-229ff735554f" containerID="a135ce320e88c8b63db4f44ee987697d2568acdfbfaa1b4aff61caf7ac1454de" exitCode=0 Dec 09 11:47:29 crc kubenswrapper[4745]: I1209 11:47:29.630020 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerDied","Data":"a135ce320e88c8b63db4f44ee987697d2568acdfbfaa1b4aff61caf7ac1454de"} Dec 09 11:47:34 crc kubenswrapper[4745]: I1209 11:47:34.367278 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fsg2c" Dec 09 11:47:34 crc kubenswrapper[4745]: I1209 11:47:34.954575 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2j5gd" Dec 09 11:47:44 crc kubenswrapper[4745]: I1209 11:47:44.931148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerStarted","Data":"e4d5c13a99803a80a363cbc2d008464cde8fe45427a4fefd33d79fa3e6190930"} Dec 09 11:47:45 crc kubenswrapper[4745]: I1209 11:47:45.939924 4745 generic.go:334] "Generic (PLEG): container finished" podID="ae7d9487-15ef-4298-914c-229ff735554f" containerID="e4d5c13a99803a80a363cbc2d008464cde8fe45427a4fefd33d79fa3e6190930" exitCode=0 Dec 09 11:47:45 crc kubenswrapper[4745]: I1209 11:47:45.940006 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" 
event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerDied","Data":"e4d5c13a99803a80a363cbc2d008464cde8fe45427a4fefd33d79fa3e6190930"} Dec 09 11:47:46 crc kubenswrapper[4745]: I1209 11:47:46.949792 4745 generic.go:334] "Generic (PLEG): container finished" podID="ae7d9487-15ef-4298-914c-229ff735554f" containerID="f35f39b46af6d47fb0a6191f0a4334de9445ac1bf802253733904432dd8fd9d4" exitCode=0 Dec 09 11:47:46 crc kubenswrapper[4745]: I1209 11:47:46.949846 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerDied","Data":"f35f39b46af6d47fb0a6191f0a4334de9445ac1bf802253733904432dd8fd9d4"} Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.252157 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.383668 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvp7\" (UniqueName: \"kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7\") pod \"ae7d9487-15ef-4298-914c-229ff735554f\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.383725 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util\") pod \"ae7d9487-15ef-4298-914c-229ff735554f\" (UID: \"ae7d9487-15ef-4298-914c-229ff735554f\") " Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.383764 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle\") pod \"ae7d9487-15ef-4298-914c-229ff735554f\" (UID: 
\"ae7d9487-15ef-4298-914c-229ff735554f\") " Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.384995 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle" (OuterVolumeSpecName: "bundle") pod "ae7d9487-15ef-4298-914c-229ff735554f" (UID: "ae7d9487-15ef-4298-914c-229ff735554f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.390702 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7" (OuterVolumeSpecName: "kube-api-access-qfvp7") pod "ae7d9487-15ef-4298-914c-229ff735554f" (UID: "ae7d9487-15ef-4298-914c-229ff735554f"). InnerVolumeSpecName "kube-api-access-qfvp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.395200 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util" (OuterVolumeSpecName: "util") pod "ae7d9487-15ef-4298-914c-229ff735554f" (UID: "ae7d9487-15ef-4298-914c-229ff735554f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.486115 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvp7\" (UniqueName: \"kubernetes.io/projected/ae7d9487-15ef-4298-914c-229ff735554f-kube-api-access-qfvp7\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.486163 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-util\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.486175 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae7d9487-15ef-4298-914c-229ff735554f-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.966222 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" event={"ID":"ae7d9487-15ef-4298-914c-229ff735554f","Type":"ContainerDied","Data":"8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba"} Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.966666 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f8e2fb333598e6cbb7a674a5e7ccbb59eaf0bd25683799afbbb5537931fb9ba" Dec 09 11:47:48 crc kubenswrapper[4745]: I1209 11:47:48.966311 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.472690 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh"] Dec 09 11:47:55 crc kubenswrapper[4745]: E1209 11:47:55.473505 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="util" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.473539 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="util" Dec 09 11:47:55 crc kubenswrapper[4745]: E1209 11:47:55.473554 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="pull" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.473561 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="pull" Dec 09 11:47:55 crc kubenswrapper[4745]: E1209 11:47:55.473576 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="extract" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.473583 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="extract" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.473704 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7d9487-15ef-4298-914c-229ff735554f" containerName="extract" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.474260 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.474981 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.475105 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.475176 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.475821 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.475886 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f" gracePeriod=600 Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.479316 4745 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.479559 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.479711 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-hsttq" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.493185 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wnsv\" (UniqueName: \"kubernetes.io/projected/bf9a2ad4-328c-4c79-b29e-01320d33b98c-kube-api-access-8wnsv\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: \"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.493643 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf9a2ad4-328c-4c79-b29e-01320d33b98c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: \"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.499573 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh"] Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.595389 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wnsv\" (UniqueName: \"kubernetes.io/projected/bf9a2ad4-328c-4c79-b29e-01320d33b98c-kube-api-access-8wnsv\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: 
\"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.595467 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf9a2ad4-328c-4c79-b29e-01320d33b98c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: \"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.596206 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf9a2ad4-328c-4c79-b29e-01320d33b98c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: \"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.621203 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wnsv\" (UniqueName: \"kubernetes.io/projected/bf9a2ad4-328c-4c79-b29e-01320d33b98c-kube-api-access-8wnsv\") pod \"cert-manager-operator-controller-manager-64cf6dff88-vxwbh\" (UID: \"bf9a2ad4-328c-4c79-b29e-01320d33b98c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:55 crc kubenswrapper[4745]: I1209 11:47:55.792446 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" Dec 09 11:47:56 crc kubenswrapper[4745]: I1209 11:47:56.049248 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f" exitCode=0 Dec 09 11:47:56 crc kubenswrapper[4745]: I1209 11:47:56.049712 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f"} Dec 09 11:47:56 crc kubenswrapper[4745]: I1209 11:47:56.050034 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48"} Dec 09 11:47:56 crc kubenswrapper[4745]: I1209 11:47:56.050057 4745 scope.go:117] "RemoveContainer" containerID="ca79ed29df3b434b12e331278bd50ec4ed3dddcbfabf7d15e6a04655fcdb16e2" Dec 09 11:47:56 crc kubenswrapper[4745]: I1209 11:47:56.155062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh"] Dec 09 11:47:56 crc kubenswrapper[4745]: W1209 11:47:56.163826 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9a2ad4_328c_4c79_b29e_01320d33b98c.slice/crio-906a09ba04f401499329f2a808e83f537cdeb303ce7929a8aea823b0e1fafaa5 WatchSource:0}: Error finding container 906a09ba04f401499329f2a808e83f537cdeb303ce7929a8aea823b0e1fafaa5: Status 404 returned error can't find the container with id 906a09ba04f401499329f2a808e83f537cdeb303ce7929a8aea823b0e1fafaa5 Dec 09 11:47:57 crc 
kubenswrapper[4745]: I1209 11:47:57.058384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" event={"ID":"bf9a2ad4-328c-4c79-b29e-01320d33b98c","Type":"ContainerStarted","Data":"906a09ba04f401499329f2a808e83f537cdeb303ce7929a8aea823b0e1fafaa5"} Dec 09 11:48:07 crc kubenswrapper[4745]: I1209 11:48:07.141010 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" event={"ID":"bf9a2ad4-328c-4c79-b29e-01320d33b98c","Type":"ContainerStarted","Data":"638ca257cbbef7d0cf0c33bd8a09f522547932447fcdb72ab04d08c0b7a074eb"} Dec 09 11:48:07 crc kubenswrapper[4745]: I1209 11:48:07.159360 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-vxwbh" podStartSLOduration=1.979614252 podStartE2EDuration="12.159341428s" podCreationTimestamp="2025-12-09 11:47:55 +0000 UTC" firstStartedPulling="2025-12-09 11:47:56.166030186 +0000 UTC m=+962.991231710" lastFinishedPulling="2025-12-09 11:48:06.345757362 +0000 UTC m=+973.170958886" observedRunningTime="2025-12-09 11:48:07.156847941 +0000 UTC m=+973.982049465" watchObservedRunningTime="2025-12-09 11:48:07.159341428 +0000 UTC m=+973.984542952" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.191799 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-th5c6"] Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.193284 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.195421 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.196068 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.196368 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mmt9j" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.208929 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-th5c6"] Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.320345 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: \"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.320440 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mt6\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-kube-api-access-55mt6\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: \"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.422089 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mt6\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-kube-api-access-55mt6\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: 
\"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.422260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: \"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.441440 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: \"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.442502 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mt6\" (UniqueName: \"kubernetes.io/projected/bb6481d3-d46e-41a3-8400-caf27b4f3757-kube-api-access-55mt6\") pod \"cert-manager-webhook-f4fb5df64-th5c6\" (UID: \"bb6481d3-d46e-41a3-8400-caf27b4f3757\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.516012 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:11 crc kubenswrapper[4745]: I1209 11:48:11.913079 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-th5c6"] Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.252302 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" event={"ID":"bb6481d3-d46e-41a3-8400-caf27b4f3757","Type":"ContainerStarted","Data":"ad98c1c1ed196ad82c44767f1b5e08b0921614c5d68adec3f6dcf48a5e251b90"} Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.676762 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2"] Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.677995 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.683430 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-v4m7v" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.687159 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2"] Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.850612 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d47rj\" (UniqueName: \"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-kube-api-access-d47rj\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.850739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.953068 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.953156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d47rj\" (UniqueName: \"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-kube-api-access-d47rj\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.973060 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d47rj\" (UniqueName: \"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-kube-api-access-d47rj\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:12 crc kubenswrapper[4745]: I1209 11:48:12.977112 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fa68a26-0e62-44ac-b591-b28d528fb71a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-9nvm2\" (UID: \"7fa68a26-0e62-44ac-b591-b28d528fb71a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:13 crc kubenswrapper[4745]: I1209 11:48:13.051590 4745 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" Dec 09 11:48:13 crc kubenswrapper[4745]: I1209 11:48:13.425480 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2"] Dec 09 11:48:13 crc kubenswrapper[4745]: W1209 11:48:13.429882 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fa68a26_0e62_44ac_b591_b28d528fb71a.slice/crio-208424181ec67791ea808fc956d62197be1877dcdbd4014c8f9d7986303f9fc0 WatchSource:0}: Error finding container 208424181ec67791ea808fc956d62197be1877dcdbd4014c8f9d7986303f9fc0: Status 404 returned error can't find the container with id 208424181ec67791ea808fc956d62197be1877dcdbd4014c8f9d7986303f9fc0 Dec 09 11:48:14 crc kubenswrapper[4745]: I1209 11:48:14.282169 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" event={"ID":"7fa68a26-0e62-44ac-b591-b28d528fb71a","Type":"ContainerStarted","Data":"208424181ec67791ea808fc956d62197be1877dcdbd4014c8f9d7986303f9fc0"} Dec 09 11:48:26 crc kubenswrapper[4745]: I1209 11:48:26.378203 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" event={"ID":"7fa68a26-0e62-44ac-b591-b28d528fb71a","Type":"ContainerStarted","Data":"b2508127287443b8fcfcda878113f63e8368cd3b460fb34828f7f97df4b41d0d"} Dec 09 11:48:26 crc kubenswrapper[4745]: I1209 11:48:26.385250 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" event={"ID":"bb6481d3-d46e-41a3-8400-caf27b4f3757","Type":"ContainerStarted","Data":"0c773861944c6238ce8be5776f1f55222ab9952692b77856812894d44fd9eda1"} Dec 09 11:48:26 crc kubenswrapper[4745]: I1209 11:48:26.385599 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:26 crc kubenswrapper[4745]: I1209 11:48:26.413130 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-9nvm2" podStartSLOduration=2.479008433 podStartE2EDuration="14.413103926s" podCreationTimestamp="2025-12-09 11:48:12 +0000 UTC" firstStartedPulling="2025-12-09 11:48:13.431929079 +0000 UTC m=+980.257130593" lastFinishedPulling="2025-12-09 11:48:25.366024552 +0000 UTC m=+992.191226086" observedRunningTime="2025-12-09 11:48:26.405741308 +0000 UTC m=+993.230942842" watchObservedRunningTime="2025-12-09 11:48:26.413103926 +0000 UTC m=+993.238305450" Dec 09 11:48:26 crc kubenswrapper[4745]: I1209 11:48:26.440184 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" podStartSLOduration=2.024656082 podStartE2EDuration="15.440156564s" podCreationTimestamp="2025-12-09 11:48:11 +0000 UTC" firstStartedPulling="2025-12-09 11:48:11.926703988 +0000 UTC m=+978.751905512" lastFinishedPulling="2025-12-09 11:48:25.34220447 +0000 UTC m=+992.167405994" observedRunningTime="2025-12-09 11:48:26.43330875 +0000 UTC m=+993.258510284" watchObservedRunningTime="2025-12-09 11:48:26.440156564 +0000 UTC m=+993.265358088" Dec 09 11:48:29 crc kubenswrapper[4745]: I1209 11:48:29.820276 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-mqgqq"] Dec 09 11:48:29 crc kubenswrapper[4745]: I1209 11:48:29.821338 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:29 crc kubenswrapper[4745]: I1209 11:48:29.823198 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gc9nc" Dec 09 11:48:29 crc kubenswrapper[4745]: I1209 11:48:29.829297 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-mqgqq"] Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.013948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-bound-sa-token\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: \"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.014006 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9r55\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-kube-api-access-l9r55\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: \"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.115293 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-bound-sa-token\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: \"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.115347 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9r55\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-kube-api-access-l9r55\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: 
\"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.134823 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-bound-sa-token\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: \"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.135007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9r55\" (UniqueName: \"kubernetes.io/projected/bb38d591-af06-4500-9ee6-8b336ce15761-kube-api-access-l9r55\") pod \"cert-manager-86cb77c54b-mqgqq\" (UID: \"bb38d591-af06-4500-9ee6-8b336ce15761\") " pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.141148 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-mqgqq" Dec 09 11:48:30 crc kubenswrapper[4745]: I1209 11:48:30.604910 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-mqgqq"] Dec 09 11:48:31 crc kubenswrapper[4745]: I1209 11:48:31.415854 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-mqgqq" event={"ID":"bb38d591-af06-4500-9ee6-8b336ce15761","Type":"ContainerStarted","Data":"ad2dd1c352e8395f6d536773543d406e8dfedf6efb1cf31a422991c3b074eb45"} Dec 09 11:48:31 crc kubenswrapper[4745]: I1209 11:48:31.416391 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-mqgqq" event={"ID":"bb38d591-af06-4500-9ee6-8b336ce15761","Type":"ContainerStarted","Data":"403d8a21c01a87cae1273be395b637ce47646bb9abbe6d94e6d908369bb6a3b7"} Dec 09 11:48:31 crc kubenswrapper[4745]: I1209 11:48:31.437109 4745 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-86cb77c54b-mqgqq" podStartSLOduration=2.437083035 podStartE2EDuration="2.437083035s" podCreationTimestamp="2025-12-09 11:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:48:31.430570009 +0000 UTC m=+998.255771543" watchObservedRunningTime="2025-12-09 11:48:31.437083035 +0000 UTC m=+998.262284559" Dec 09 11:48:31 crc kubenswrapper[4745]: I1209 11:48:31.523764 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-th5c6" Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.974193 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.975290 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.978500 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.978552 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bth4l" Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.980146 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 09 11:48:34 crc kubenswrapper[4745]: I1209 11:48:34.995094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:35 crc kubenswrapper[4745]: I1209 11:48:35.082032 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf52s\" (UniqueName: 
\"kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s\") pod \"openstack-operator-index-2zrv9\" (UID: \"be4f4809-4055-4b96-b89a-7e675ecea6ef\") " pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:35 crc kubenswrapper[4745]: I1209 11:48:35.183805 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf52s\" (UniqueName: \"kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s\") pod \"openstack-operator-index-2zrv9\" (UID: \"be4f4809-4055-4b96-b89a-7e675ecea6ef\") " pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:35 crc kubenswrapper[4745]: I1209 11:48:35.207265 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf52s\" (UniqueName: \"kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s\") pod \"openstack-operator-index-2zrv9\" (UID: \"be4f4809-4055-4b96-b89a-7e675ecea6ef\") " pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:35 crc kubenswrapper[4745]: I1209 11:48:35.293736 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:35 crc kubenswrapper[4745]: I1209 11:48:35.698706 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:35 crc kubenswrapper[4745]: W1209 11:48:35.706425 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe4f4809_4055_4b96_b89a_7e675ecea6ef.slice/crio-1a5f194bb2d03f32f9f0315f2a957cbd903dcc2c6e8e4c6fc7b4bd09aca2f144 WatchSource:0}: Error finding container 1a5f194bb2d03f32f9f0315f2a957cbd903dcc2c6e8e4c6fc7b4bd09aca2f144: Status 404 returned error can't find the container with id 1a5f194bb2d03f32f9f0315f2a957cbd903dcc2c6e8e4c6fc7b4bd09aca2f144 Dec 09 11:48:36 crc kubenswrapper[4745]: I1209 11:48:36.450380 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2zrv9" event={"ID":"be4f4809-4055-4b96-b89a-7e675ecea6ef","Type":"ContainerStarted","Data":"1a5f194bb2d03f32f9f0315f2a957cbd903dcc2c6e8e4c6fc7b4bd09aca2f144"} Dec 09 11:48:37 crc kubenswrapper[4745]: I1209 11:48:37.457122 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2zrv9" event={"ID":"be4f4809-4055-4b96-b89a-7e675ecea6ef","Type":"ContainerStarted","Data":"b704f308563f88c88c3a9f68bcf045b46d25aca29c0e925ac5102d5c554e35c0"} Dec 09 11:48:37 crc kubenswrapper[4745]: I1209 11:48:37.473679 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2zrv9" podStartSLOduration=2.15160121 podStartE2EDuration="3.473659499s" podCreationTimestamp="2025-12-09 11:48:34 +0000 UTC" firstStartedPulling="2025-12-09 11:48:35.708829257 +0000 UTC m=+1002.534030781" lastFinishedPulling="2025-12-09 11:48:37.030887546 +0000 UTC m=+1003.856089070" observedRunningTime="2025-12-09 11:48:37.4703745 +0000 UTC 
m=+1004.295576034" watchObservedRunningTime="2025-12-09 11:48:37.473659499 +0000 UTC m=+1004.298861023" Dec 09 11:48:38 crc kubenswrapper[4745]: I1209 11:48:38.548805 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.161345 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-klw9s"] Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.162496 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.166663 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-klw9s"] Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.342864 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7hs\" (UniqueName: \"kubernetes.io/projected/58788af8-97b1-4820-82dc-9ced93c8d7ce-kube-api-access-kz7hs\") pod \"openstack-operator-index-klw9s\" (UID: \"58788af8-97b1-4820-82dc-9ced93c8d7ce\") " pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.444916 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7hs\" (UniqueName: \"kubernetes.io/projected/58788af8-97b1-4820-82dc-9ced93c8d7ce-kube-api-access-kz7hs\") pod \"openstack-operator-index-klw9s\" (UID: \"58788af8-97b1-4820-82dc-9ced93c8d7ce\") " pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.465611 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7hs\" (UniqueName: \"kubernetes.io/projected/58788af8-97b1-4820-82dc-9ced93c8d7ce-kube-api-access-kz7hs\") pod \"openstack-operator-index-klw9s\" (UID: 
\"58788af8-97b1-4820-82dc-9ced93c8d7ce\") " pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.470553 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2zrv9" podUID="be4f4809-4055-4b96-b89a-7e675ecea6ef" containerName="registry-server" containerID="cri-o://b704f308563f88c88c3a9f68bcf045b46d25aca29c0e925ac5102d5c554e35c0" gracePeriod=2 Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.478820 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:39 crc kubenswrapper[4745]: I1209 11:48:39.909537 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-klw9s"] Dec 09 11:48:39 crc kubenswrapper[4745]: W1209 11:48:39.912454 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58788af8_97b1_4820_82dc_9ced93c8d7ce.slice/crio-62a7d4dac4a3e3f58c884e28b731f8f36a625ea9e6c81a18da4f1fe49d36cf59 WatchSource:0}: Error finding container 62a7d4dac4a3e3f58c884e28b731f8f36a625ea9e6c81a18da4f1fe49d36cf59: Status 404 returned error can't find the container with id 62a7d4dac4a3e3f58c884e28b731f8f36a625ea9e6c81a18da4f1fe49d36cf59 Dec 09 11:48:40 crc kubenswrapper[4745]: I1209 11:48:40.479335 4745 generic.go:334] "Generic (PLEG): container finished" podID="be4f4809-4055-4b96-b89a-7e675ecea6ef" containerID="b704f308563f88c88c3a9f68bcf045b46d25aca29c0e925ac5102d5c554e35c0" exitCode=0 Dec 09 11:48:40 crc kubenswrapper[4745]: I1209 11:48:40.479453 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2zrv9" event={"ID":"be4f4809-4055-4b96-b89a-7e675ecea6ef","Type":"ContainerDied","Data":"b704f308563f88c88c3a9f68bcf045b46d25aca29c0e925ac5102d5c554e35c0"} Dec 09 11:48:40 crc 
kubenswrapper[4745]: I1209 11:48:40.481278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-klw9s" event={"ID":"58788af8-97b1-4820-82dc-9ced93c8d7ce","Type":"ContainerStarted","Data":"b9e278df3fb79130fdbb0555948cf99e1690bf0ac7cc4ef704a348ac1e4f8cd4"} Dec 09 11:48:40 crc kubenswrapper[4745]: I1209 11:48:40.481314 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-klw9s" event={"ID":"58788af8-97b1-4820-82dc-9ced93c8d7ce","Type":"ContainerStarted","Data":"62a7d4dac4a3e3f58c884e28b731f8f36a625ea9e6c81a18da4f1fe49d36cf59"} Dec 09 11:48:40 crc kubenswrapper[4745]: I1209 11:48:40.504002 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-klw9s" podStartSLOduration=1.127184139 podStartE2EDuration="1.503980915s" podCreationTimestamp="2025-12-09 11:48:39 +0000 UTC" firstStartedPulling="2025-12-09 11:48:39.91665709 +0000 UTC m=+1006.741858614" lastFinishedPulling="2025-12-09 11:48:40.293453866 +0000 UTC m=+1007.118655390" observedRunningTime="2025-12-09 11:48:40.498783225 +0000 UTC m=+1007.323984749" watchObservedRunningTime="2025-12-09 11:48:40.503980915 +0000 UTC m=+1007.329182439" Dec 09 11:48:40 crc kubenswrapper[4745]: I1209 11:48:40.951059 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.070286 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf52s\" (UniqueName: \"kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s\") pod \"be4f4809-4055-4b96-b89a-7e675ecea6ef\" (UID: \"be4f4809-4055-4b96-b89a-7e675ecea6ef\") " Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.075651 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s" (OuterVolumeSpecName: "kube-api-access-gf52s") pod "be4f4809-4055-4b96-b89a-7e675ecea6ef" (UID: "be4f4809-4055-4b96-b89a-7e675ecea6ef"). InnerVolumeSpecName "kube-api-access-gf52s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.172089 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf52s\" (UniqueName: \"kubernetes.io/projected/be4f4809-4055-4b96-b89a-7e675ecea6ef-kube-api-access-gf52s\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.488042 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2zrv9" Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.488066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2zrv9" event={"ID":"be4f4809-4055-4b96-b89a-7e675ecea6ef","Type":"ContainerDied","Data":"1a5f194bb2d03f32f9f0315f2a957cbd903dcc2c6e8e4c6fc7b4bd09aca2f144"} Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.488459 4745 scope.go:117] "RemoveContainer" containerID="b704f308563f88c88c3a9f68bcf045b46d25aca29c0e925ac5102d5c554e35c0" Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.514746 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.519040 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2zrv9"] Dec 09 11:48:41 crc kubenswrapper[4745]: I1209 11:48:41.565391 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4f4809-4055-4b96-b89a-7e675ecea6ef" path="/var/lib/kubelet/pods/be4f4809-4055-4b96-b89a-7e675ecea6ef/volumes" Dec 09 11:48:49 crc kubenswrapper[4745]: I1209 11:48:49.479599 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:49 crc kubenswrapper[4745]: I1209 11:48:49.480357 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:49 crc kubenswrapper[4745]: I1209 11:48:49.509874 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:49 crc kubenswrapper[4745]: I1209 11:48:49.581373 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-klw9s" Dec 09 11:48:50 crc 
kubenswrapper[4745]: I1209 11:48:50.993262 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9"] Dec 09 11:48:50 crc kubenswrapper[4745]: E1209 11:48:50.993549 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4f4809-4055-4b96-b89a-7e675ecea6ef" containerName="registry-server" Dec 09 11:48:50 crc kubenswrapper[4745]: I1209 11:48:50.993561 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4f4809-4055-4b96-b89a-7e675ecea6ef" containerName="registry-server" Dec 09 11:48:50 crc kubenswrapper[4745]: I1209 11:48:50.993676 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4f4809-4055-4b96-b89a-7e675ecea6ef" containerName="registry-server" Dec 09 11:48:50 crc kubenswrapper[4745]: I1209 11:48:50.994464 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:50 crc kubenswrapper[4745]: I1209 11:48:50.996638 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lqgfg" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.005762 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9"] Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.023589 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.023644 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.023815 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74p7h\" (UniqueName: \"kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.124843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74p7h\" (UniqueName: \"kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.124923 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.124962 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.125452 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.125620 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.145013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74p7h\" (UniqueName: \"kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h\") pod \"a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.310589 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:51 crc kubenswrapper[4745]: I1209 11:48:51.719925 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9"] Dec 09 11:48:51 crc kubenswrapper[4745]: W1209 11:48:51.728425 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40505d15_7c11_47d5_b8f6_a5a1cc5b8aa3.slice/crio-da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73 WatchSource:0}: Error finding container da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73: Status 404 returned error can't find the container with id da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73 Dec 09 11:48:52 crc kubenswrapper[4745]: I1209 11:48:52.574181 4745 generic.go:334] "Generic (PLEG): container finished" podID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerID="581a4ba5c401e5a8cb8ca75758c579a15c54bf6842e226d351c47f303cb4d4ec" exitCode=0 Dec 09 11:48:52 crc kubenswrapper[4745]: I1209 11:48:52.574230 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" event={"ID":"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3","Type":"ContainerDied","Data":"581a4ba5c401e5a8cb8ca75758c579a15c54bf6842e226d351c47f303cb4d4ec"} Dec 09 11:48:52 crc kubenswrapper[4745]: I1209 11:48:52.574445 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" event={"ID":"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3","Type":"ContainerStarted","Data":"da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73"} Dec 09 11:48:53 crc kubenswrapper[4745]: I1209 11:48:53.582008 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerID="93d1969f1e904fc7fe0c0ff64fd4c51a76d816647c389d68d2fa4df37bb07f79" exitCode=0 Dec 09 11:48:53 crc kubenswrapper[4745]: I1209 11:48:53.582119 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" event={"ID":"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3","Type":"ContainerDied","Data":"93d1969f1e904fc7fe0c0ff64fd4c51a76d816647c389d68d2fa4df37bb07f79"} Dec 09 11:48:55 crc kubenswrapper[4745]: I1209 11:48:55.597717 4745 generic.go:334] "Generic (PLEG): container finished" podID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerID="549a099d73dc0d65a6adfba71f328d8b73401bd28fbdbe8f47e4c75b8cdfc38a" exitCode=0 Dec 09 11:48:55 crc kubenswrapper[4745]: I1209 11:48:55.598288 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" event={"ID":"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3","Type":"ContainerDied","Data":"549a099d73dc0d65a6adfba71f328d8b73401bd28fbdbe8f47e4c75b8cdfc38a"} Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.873604 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.985139 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util\") pod \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.985215 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74p7h\" (UniqueName: \"kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h\") pod \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.985283 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle\") pod \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\" (UID: \"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3\") " Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.986282 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle" (OuterVolumeSpecName: "bundle") pod "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" (UID: "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.992931 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h" (OuterVolumeSpecName: "kube-api-access-74p7h") pod "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" (UID: "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3"). InnerVolumeSpecName "kube-api-access-74p7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:48:56 crc kubenswrapper[4745]: I1209 11:48:56.999116 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util" (OuterVolumeSpecName: "util") pod "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" (UID: "40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.087237 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-util\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.087298 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74p7h\" (UniqueName: \"kubernetes.io/projected/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-kube-api-access-74p7h\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.087311 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.615246 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" event={"ID":"40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3","Type":"ContainerDied","Data":"da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73"} Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.615293 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1f01749e309044ede4e0c835aa069f766cdd7d194f746520afd0bc9a890e73" Dec 09 11:48:57 crc kubenswrapper[4745]: I1209 11:48:57.615313 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.218984 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf"] Dec 09 11:49:04 crc kubenswrapper[4745]: E1209 11:49:04.219749 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="pull" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.219766 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="pull" Dec 09 11:49:04 crc kubenswrapper[4745]: E1209 11:49:04.219782 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="util" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.219790 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="util" Dec 09 11:49:04 crc kubenswrapper[4745]: E1209 11:49:04.219812 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="extract" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.219822 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="extract" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.220037 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3" containerName="extract" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.220578 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.225042 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-wwgc5" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.252159 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf"] Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.390115 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvd2\" (UniqueName: \"kubernetes.io/projected/5c31507c-4f73-4d7d-85b5-45bd562ca0f3-kube-api-access-4qvd2\") pod \"openstack-operator-controller-operator-7979d445b4-rz2zf\" (UID: \"5c31507c-4f73-4d7d-85b5-45bd562ca0f3\") " pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.492034 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvd2\" (UniqueName: \"kubernetes.io/projected/5c31507c-4f73-4d7d-85b5-45bd562ca0f3-kube-api-access-4qvd2\") pod \"openstack-operator-controller-operator-7979d445b4-rz2zf\" (UID: \"5c31507c-4f73-4d7d-85b5-45bd562ca0f3\") " pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.515321 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvd2\" (UniqueName: \"kubernetes.io/projected/5c31507c-4f73-4d7d-85b5-45bd562ca0f3-kube-api-access-4qvd2\") pod \"openstack-operator-controller-operator-7979d445b4-rz2zf\" (UID: \"5c31507c-4f73-4d7d-85b5-45bd562ca0f3\") " pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:04 crc kubenswrapper[4745]: I1209 11:49:04.546940 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:05 crc kubenswrapper[4745]: I1209 11:49:05.087888 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf"] Dec 09 11:49:05 crc kubenswrapper[4745]: I1209 11:49:05.669071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" event={"ID":"5c31507c-4f73-4d7d-85b5-45bd562ca0f3","Type":"ContainerStarted","Data":"e8540b650c75252ba3b25661917b88d944d3676d6be731545991b11b909aa86d"} Dec 09 11:49:13 crc kubenswrapper[4745]: I1209 11:49:13.926721 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" event={"ID":"5c31507c-4f73-4d7d-85b5-45bd562ca0f3","Type":"ContainerStarted","Data":"3c542a6e94ac03cd92ebf995aff9085b9ab8e1fbfe05a4e1c1edd5f43aaca7cb"} Dec 09 11:49:13 crc kubenswrapper[4745]: I1209 11:49:13.927307 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:13 crc kubenswrapper[4745]: I1209 11:49:13.956062 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" podStartSLOduration=2.308264566 podStartE2EDuration="9.956040175s" podCreationTimestamp="2025-12-09 11:49:04 +0000 UTC" firstStartedPulling="2025-12-09 11:49:05.096866797 +0000 UTC m=+1031.922068321" lastFinishedPulling="2025-12-09 11:49:12.744642406 +0000 UTC m=+1039.569843930" observedRunningTime="2025-12-09 11:49:13.949877809 +0000 UTC m=+1040.775079383" watchObservedRunningTime="2025-12-09 11:49:13.956040175 +0000 UTC m=+1040.781241699" Dec 09 11:49:24 crc kubenswrapper[4745]: I1209 11:49:24.550082 
4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7979d445b4-rz2zf" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.463896 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.466093 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.470316 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qdctp" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.477467 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.487350 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.493846 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.499906 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-f7bk4" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.516437 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.518017 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.520633 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hf7gm" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.528091 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.539978 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.541224 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.545156 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9dfsd" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.556563 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.578829 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.608800 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.609772 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.619242 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xwbh2" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.636017 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.637364 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.647278 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9x9kw" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.653339 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvqhw\" (UniqueName: \"kubernetes.io/projected/a8ec2c42-ff99-42c1-b0bb-e362207f4e3e-kube-api-access-dvqhw\") pod \"barbican-operator-controller-manager-7d9dfd778-5wtvn\" (UID: \"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659321 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpzr\" (UniqueName: \"kubernetes.io/projected/668cdfd0-6b54-4750-b58f-97b26180b203-kube-api-access-fcpzr\") pod \"glance-operator-controller-manager-5697bb5779-bngxs\" (UID: \"668cdfd0-6b54-4750-b58f-97b26180b203\") " 
pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659344 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvndq\" (UniqueName: \"kubernetes.io/projected/ae7706be-4789-4153-adf4-9abb8e4ee8d8-kube-api-access-lvndq\") pod \"cinder-operator-controller-manager-6c677c69b-8h2wf\" (UID: \"ae7706be-4789-4153-adf4-9abb8e4ee8d8\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659383 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2wwh\" (UniqueName: \"kubernetes.io/projected/faa047d4-ea12-493a-aa0a-b429e0c5a123-kube-api-access-z2wwh\") pod \"horizon-operator-controller-manager-68c6d99b8f-h44ls\" (UID: \"faa047d4-ea12-493a-aa0a-b429e0c5a123\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqg6c\" (UniqueName: \"kubernetes.io/projected/42df7b5d-8be1-4670-b063-e83cf65b1dae-kube-api-access-vqg6c\") pod \"designate-operator-controller-manager-697fb699cf-6vwcq\" (UID: \"42df7b5d-8be1-4670-b063-e83cf65b1dae\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.659433 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8hzq\" (UniqueName: \"kubernetes.io/projected/b026a92d-e1be-43c9-8e7a-1f66260bab18-kube-api-access-s8hzq\") pod \"heat-operator-controller-manager-5f64f6f8bb-ljgwk\" (UID: \"b026a92d-e1be-43c9-8e7a-1f66260bab18\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" Dec 09 
11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.665581 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.666826 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.674609 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.675117 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j49xb" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.680749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.690962 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.708566 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.709953 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.717089 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pjvdd" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.729565 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.740253 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.741807 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.745635 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-txf9v" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.752598 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.754146 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.756478 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-wfs8t" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.760938 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvqhw\" (UniqueName: \"kubernetes.io/projected/a8ec2c42-ff99-42c1-b0bb-e362207f4e3e-kube-api-access-dvqhw\") pod \"barbican-operator-controller-manager-7d9dfd778-5wtvn\" (UID: \"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.761005 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpzr\" (UniqueName: \"kubernetes.io/projected/668cdfd0-6b54-4750-b58f-97b26180b203-kube-api-access-fcpzr\") pod \"glance-operator-controller-manager-5697bb5779-bngxs\" (UID: \"668cdfd0-6b54-4750-b58f-97b26180b203\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.761028 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvndq\" (UniqueName: \"kubernetes.io/projected/ae7706be-4789-4153-adf4-9abb8e4ee8d8-kube-api-access-lvndq\") pod \"cinder-operator-controller-manager-6c677c69b-8h2wf\" (UID: \"ae7706be-4789-4153-adf4-9abb8e4ee8d8\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.761061 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2wwh\" (UniqueName: \"kubernetes.io/projected/faa047d4-ea12-493a-aa0a-b429e0c5a123-kube-api-access-z2wwh\") pod 
\"horizon-operator-controller-manager-68c6d99b8f-h44ls\" (UID: \"faa047d4-ea12-493a-aa0a-b429e0c5a123\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.761115 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqg6c\" (UniqueName: \"kubernetes.io/projected/42df7b5d-8be1-4670-b063-e83cf65b1dae-kube-api-access-vqg6c\") pod \"designate-operator-controller-manager-697fb699cf-6vwcq\" (UID: \"42df7b5d-8be1-4670-b063-e83cf65b1dae\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.761168 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8hzq\" (UniqueName: \"kubernetes.io/projected/b026a92d-e1be-43c9-8e7a-1f66260bab18-kube-api-access-s8hzq\") pod \"heat-operator-controller-manager-5f64f6f8bb-ljgwk\" (UID: \"b026a92d-e1be-43c9-8e7a-1f66260bab18\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.833696 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvqhw\" (UniqueName: \"kubernetes.io/projected/a8ec2c42-ff99-42c1-b0bb-e362207f4e3e-kube-api-access-dvqhw\") pod \"barbican-operator-controller-manager-7d9dfd778-5wtvn\" (UID: \"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.834302 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqg6c\" (UniqueName: \"kubernetes.io/projected/42df7b5d-8be1-4670-b063-e83cf65b1dae-kube-api-access-vqg6c\") pod \"designate-operator-controller-manager-697fb699cf-6vwcq\" (UID: \"42df7b5d-8be1-4670-b063-e83cf65b1dae\") " 
pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.838144 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.838255 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvndq\" (UniqueName: \"kubernetes.io/projected/ae7706be-4789-4153-adf4-9abb8e4ee8d8-kube-api-access-lvndq\") pod \"cinder-operator-controller-manager-6c677c69b-8h2wf\" (UID: \"ae7706be-4789-4153-adf4-9abb8e4ee8d8\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.840432 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2wwh\" (UniqueName: \"kubernetes.io/projected/faa047d4-ea12-493a-aa0a-b429e0c5a123-kube-api-access-z2wwh\") pod \"horizon-operator-controller-manager-68c6d99b8f-h44ls\" (UID: \"faa047d4-ea12-493a-aa0a-b429e0c5a123\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.842020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpzr\" (UniqueName: \"kubernetes.io/projected/668cdfd0-6b54-4750-b58f-97b26180b203-kube-api-access-fcpzr\") pod \"glance-operator-controller-manager-5697bb5779-bngxs\" (UID: \"668cdfd0-6b54-4750-b58f-97b26180b203\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.844005 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8hzq\" (UniqueName: \"kubernetes.io/projected/b026a92d-e1be-43c9-8e7a-1f66260bab18-kube-api-access-s8hzq\") pod \"heat-operator-controller-manager-5f64f6f8bb-ljgwk\" (UID: 
\"b026a92d-e1be-43c9-8e7a-1f66260bab18\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.853492 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.865162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.865241 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5nk\" (UniqueName: \"kubernetes.io/projected/641a4ff2-1fa2-4402-a313-d27dcc9c4294-kube-api-access-2n5nk\") pod \"keystone-operator-controller-manager-7765d96ddf-cxl42\" (UID: \"641a4ff2-1fa2-4402-a313-d27dcc9c4294\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.865311 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5vp\" (UniqueName: \"kubernetes.io/projected/62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa-kube-api-access-6n5vp\") pod \"ironic-operator-controller-manager-967d97867-mmpfw\" (UID: \"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.865370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwgz\" (UniqueName: 
\"kubernetes.io/projected/8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d-kube-api-access-7rwgz\") pod \"manila-operator-controller-manager-5b5fd79c9c-dxc2l\" (UID: \"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.865425 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzxm\" (UniqueName: \"kubernetes.io/projected/6368805c-1aab-4425-b599-671f89f30110-kube-api-access-vfzxm\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.870840 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.908126 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.919555 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"] Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.920770 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.924817 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"]
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.936504 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-sgdn2"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.936588 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.946794 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"]
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.949001 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.952761 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"]
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.960591 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"]
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.961572 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8qcrd"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.961696 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.962173 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.964969 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-28n4v"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966118 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzxm\" (UniqueName: \"kubernetes.io/projected/6368805c-1aab-4425-b599-671f89f30110-kube-api-access-vfzxm\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966167 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvrsv\" (UniqueName: \"kubernetes.io/projected/6354cdc2-296f-4479-a46e-9f2be37c4eef-kube-api-access-fvrsv\") pod \"nova-operator-controller-manager-697bc559fc-662j9\" (UID: \"6354cdc2-296f-4479-a46e-9f2be37c4eef\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966192 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvn6k\" (UniqueName: \"kubernetes.io/projected/336faca3-d04c-4ab5-b5dd-d9f031c80c64-kube-api-access-cvn6k\") pod \"mariadb-operator-controller-manager-79c8c4686c-sch7g\" (UID: \"336faca3-d04c-4ab5-b5dd-d9f031c80c64\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966215 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfldx\" (UniqueName: \"kubernetes.io/projected/cbb6789a-1426-4ea2-aa2b-76959271ffc2-kube-api-access-cfldx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-85z5r\" (UID: \"cbb6789a-1426-4ea2-aa2b-76959271ffc2\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966243 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966262 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5nk\" (UniqueName: \"kubernetes.io/projected/641a4ff2-1fa2-4402-a313-d27dcc9c4294-kube-api-access-2n5nk\") pod \"keystone-operator-controller-manager-7765d96ddf-cxl42\" (UID: \"641a4ff2-1fa2-4402-a313-d27dcc9c4294\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966296 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5vp\" (UniqueName: \"kubernetes.io/projected/62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa-kube-api-access-6n5vp\") pod \"ironic-operator-controller-manager-967d97867-mmpfw\" (UID: \"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.966319 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwgz\" (UniqueName: \"kubernetes.io/projected/8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d-kube-api-access-7rwgz\") pod \"manila-operator-controller-manager-5b5fd79c9c-dxc2l\" (UID: \"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"
Dec 09 11:49:45 crc kubenswrapper[4745]: E1209 11:49:45.966673 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 09 11:49:45 crc kubenswrapper[4745]: E1209 11:49:45.966736 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:46.466714744 +0000 UTC m=+1073.291916268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.996729 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwgz\" (UniqueName: \"kubernetes.io/projected/8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d-kube-api-access-7rwgz\") pod \"manila-operator-controller-manager-5b5fd79c9c-dxc2l\" (UID: \"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"
Dec 09 11:49:45 crc kubenswrapper[4745]: I1209 11:49:45.997269 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5nk\" (UniqueName: \"kubernetes.io/projected/641a4ff2-1fa2-4402-a313-d27dcc9c4294-kube-api-access-2n5nk\") pod \"keystone-operator-controller-manager-7765d96ddf-cxl42\" (UID: \"641a4ff2-1fa2-4402-a313-d27dcc9c4294\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:45.998277 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.006054 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzxm\" (UniqueName: \"kubernetes.io/projected/6368805c-1aab-4425-b599-671f89f30110-kube-api-access-vfzxm\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.007782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5vp\" (UniqueName: \"kubernetes.io/projected/62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa-kube-api-access-6n5vp\") pod \"ironic-operator-controller-manager-967d97867-mmpfw\" (UID: \"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.008435 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.013742 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.015817 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.016139 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wpzn6"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.017396 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.023014 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7hrrl"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.023040 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.032010 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.041707 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.064008 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077228 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdw5t\" (UniqueName: \"kubernetes.io/projected/3e400982-edc3-462a-8917-5857ff6dd61e-kube-api-access-wdw5t\") pod \"octavia-operator-controller-manager-998648c74-2tl6s\" (UID: \"3e400982-edc3-462a-8917-5857ff6dd61e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077279 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45h5b\" (UniqueName: \"kubernetes.io/projected/5dcc5255-fdaa-463a-960c-5bc89c469a25-kube-api-access-45h5b\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077326 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvrsv\" (UniqueName: \"kubernetes.io/projected/6354cdc2-296f-4479-a46e-9f2be37c4eef-kube-api-access-fvrsv\") pod \"nova-operator-controller-manager-697bc559fc-662j9\" (UID: \"6354cdc2-296f-4479-a46e-9f2be37c4eef\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077356 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvn6k\" (UniqueName: \"kubernetes.io/projected/336faca3-d04c-4ab5-b5dd-d9f031c80c64-kube-api-access-cvn6k\") pod \"mariadb-operator-controller-manager-79c8c4686c-sch7g\" (UID: \"336faca3-d04c-4ab5-b5dd-d9f031c80c64\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077408 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfldx\" (UniqueName: \"kubernetes.io/projected/cbb6789a-1426-4ea2-aa2b-76959271ffc2-kube-api-access-cfldx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-85z5r\" (UID: \"cbb6789a-1426-4ea2-aa2b-76959271ffc2\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.077844 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.083367 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gwnbz"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.083629 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.088787 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.097320 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvn6k\" (UniqueName: \"kubernetes.io/projected/336faca3-d04c-4ab5-b5dd-d9f031c80c64-kube-api-access-cvn6k\") pod \"mariadb-operator-controller-manager-79c8c4686c-sch7g\" (UID: \"336faca3-d04c-4ab5-b5dd-d9f031c80c64\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.118304 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvrsv\" (UniqueName: \"kubernetes.io/projected/6354cdc2-296f-4479-a46e-9f2be37c4eef-kube-api-access-fvrsv\") pod \"nova-operator-controller-manager-697bc559fc-662j9\" (UID: \"6354cdc2-296f-4479-a46e-9f2be37c4eef\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.118382 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.120019 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfldx\" (UniqueName: \"kubernetes.io/projected/cbb6789a-1426-4ea2-aa2b-76959271ffc2-kube-api-access-cfldx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-85z5r\" (UID: \"cbb6789a-1426-4ea2-aa2b-76959271ffc2\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.120198 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.122733 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.170546 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.171984 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.172765 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.173349 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.188600 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-97hdj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.188872 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-l6m98"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.190624 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdw5t\" (UniqueName: \"kubernetes.io/projected/3e400982-edc3-462a-8917-5857ff6dd61e-kube-api-access-wdw5t\") pod \"octavia-operator-controller-manager-998648c74-2tl6s\" (UID: \"3e400982-edc3-462a-8917-5857ff6dd61e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.190684 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45h5b\" (UniqueName: \"kubernetes.io/projected/5dcc5255-fdaa-463a-960c-5bc89c469a25-kube-api-access-45h5b\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.190794 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.191751 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.192040 4745 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.192085 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:46.692072092 +0000 UTC m=+1073.517273606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.192348 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.196345 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-np88r"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.197478 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.205734 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.206652 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mbrfj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.206978 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-7p6fs"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.217249 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.217304 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.229538 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-np88r"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.240473 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.241247 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45h5b\" (UniqueName: \"kubernetes.io/projected/5dcc5255-fdaa-463a-960c-5bc89c469a25-kube-api-access-45h5b\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.241705 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.246896 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.251364 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-r8t8g"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.287764 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.294974 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8j4\" (UniqueName: \"kubernetes.io/projected/8221e85a-690c-4344-9bd7-5adc0e40b513-kube-api-access-4v8j4\") pod \"swift-operator-controller-manager-9d58d64bc-dfwwk\" (UID: \"8221e85a-690c-4344-9bd7-5adc0e40b513\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.295036 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rjr\" (UniqueName: \"kubernetes.io/projected/e98864fd-e07e-4f9f-a9c1-b55c69a26922-kube-api-access-l4rjr\") pod \"watcher-operator-controller-manager-667bd8d554-4q2bv\" (UID: \"e98864fd-e07e-4f9f-a9c1-b55c69a26922\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.295057 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwl9\" (UniqueName: \"kubernetes.io/projected/97f9d87f-9f72-452a-ba66-29c916324b43-kube-api-access-7nwl9\") pod \"placement-operator-controller-manager-78f8948974-7jfkb\" (UID: \"97f9d87f-9f72-452a-ba66-29c916324b43\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.295080 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nnsb\" (UniqueName: \"kubernetes.io/projected/6267400f-bf32-4ce3-b875-831cee43fc17-kube-api-access-6nnsb\") pod \"telemetry-operator-controller-manager-58d5ff84df-f582k\" (UID: \"6267400f-bf32-4ce3-b875-831cee43fc17\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.295114 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvj5\" (UniqueName: \"kubernetes.io/projected/23db7946-4921-4ac3-aea8-55abc4c4ba1c-kube-api-access-6dvj5\") pod \"ovn-operator-controller-manager-b6456fdb6-jw6tj\" (UID: \"23db7946-4921-4ac3-aea8-55abc4c4ba1c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.295131 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpjnr\" (UniqueName: \"kubernetes.io/projected/40fc90a5-6278-4b54-8260-728265a2501a-kube-api-access-gpjnr\") pod \"test-operator-controller-manager-5854674fcc-np88r\" (UID: \"40fc90a5-6278-4b54-8260-728265a2501a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.319226 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.732622 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.735408 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.739471 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdw5t\" (UniqueName: \"kubernetes.io/projected/3e400982-edc3-462a-8917-5857ff6dd61e-kube-api-access-wdw5t\") pod \"octavia-operator-controller-manager-998648c74-2tl6s\" (UID: \"3e400982-edc3-462a-8917-5857ff6dd61e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.759662 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.763827 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.790232 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.790561 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.790754 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wmvqt"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.791651 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.799896 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8j4\" (UniqueName: \"kubernetes.io/projected/8221e85a-690c-4344-9bd7-5adc0e40b513-kube-api-access-4v8j4\") pod \"swift-operator-controller-manager-9d58d64bc-dfwwk\" (UID: \"8221e85a-690c-4344-9bd7-5adc0e40b513\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.799994 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800019 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rjr\" (UniqueName: \"kubernetes.io/projected/e98864fd-e07e-4f9f-a9c1-b55c69a26922-kube-api-access-l4rjr\") pod \"watcher-operator-controller-manager-667bd8d554-4q2bv\" (UID: \"e98864fd-e07e-4f9f-a9c1-b55c69a26922\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800053 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwl9\" (UniqueName: \"kubernetes.io/projected/97f9d87f-9f72-452a-ba66-29c916324b43-kube-api-access-7nwl9\") pod \"placement-operator-controller-manager-78f8948974-7jfkb\" (UID: \"97f9d87f-9f72-452a-ba66-29c916324b43\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.800440 4745 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.800628 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:47.800598138 +0000 UTC m=+1074.625799662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800688 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nnsb\" (UniqueName: \"kubernetes.io/projected/6267400f-bf32-4ce3-b875-831cee43fc17-kube-api-access-6nnsb\") pod \"telemetry-operator-controller-manager-58d5ff84df-f582k\" (UID: \"6267400f-bf32-4ce3-b875-831cee43fc17\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800779 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvj5\" (UniqueName: \"kubernetes.io/projected/23db7946-4921-4ac3-aea8-55abc4c4ba1c-kube-api-access-6dvj5\") pod \"ovn-operator-controller-manager-b6456fdb6-jw6tj\" (UID: \"23db7946-4921-4ac3-aea8-55abc4c4ba1c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.800831 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpjnr\" (UniqueName: \"kubernetes.io/projected/40fc90a5-6278-4b54-8260-728265a2501a-kube-api-access-gpjnr\") pod \"test-operator-controller-manager-5854674fcc-np88r\" (UID: \"40fc90a5-6278-4b54-8260-728265a2501a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.801137 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.801273 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: E1209 11:49:46.801320 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:47.801303987 +0000 UTC m=+1074.626505511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.835855 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nnsb\" (UniqueName: \"kubernetes.io/projected/6267400f-bf32-4ce3-b875-831cee43fc17-kube-api-access-6nnsb\") pod \"telemetry-operator-controller-manager-58d5ff84df-f582k\" (UID: \"6267400f-bf32-4ce3-b875-831cee43fc17\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.841028 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvj5\" (UniqueName: \"kubernetes.io/projected/23db7946-4921-4ac3-aea8-55abc4c4ba1c-kube-api-access-6dvj5\") pod \"ovn-operator-controller-manager-b6456fdb6-jw6tj\" (UID: \"23db7946-4921-4ac3-aea8-55abc4c4ba1c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.847816 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8j4\" (UniqueName: \"kubernetes.io/projected/8221e85a-690c-4344-9bd7-5adc0e40b513-kube-api-access-4v8j4\") pod \"swift-operator-controller-manager-9d58d64bc-dfwwk\" (UID: \"8221e85a-690c-4344-9bd7-5adc0e40b513\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.849760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpjnr\" (UniqueName: \"kubernetes.io/projected/40fc90a5-6278-4b54-8260-728265a2501a-kube-api-access-gpjnr\") pod \"test-operator-controller-manager-5854674fcc-np88r\" (UID: \"40fc90a5-6278-4b54-8260-728265a2501a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.864862 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rjr\" (UniqueName: \"kubernetes.io/projected/e98864fd-e07e-4f9f-a9c1-b55c69a26922-kube-api-access-l4rjr\") pod \"watcher-operator-controller-manager-667bd8d554-4q2bv\" (UID: \"e98864fd-e07e-4f9f-a9c1-b55c69a26922\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.867413 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwl9\" (UniqueName: \"kubernetes.io/projected/97f9d87f-9f72-452a-ba66-29c916324b43-kube-api-access-7nwl9\") pod \"placement-operator-controller-manager-78f8948974-7jfkb\" (UID: \"97f9d87f-9f72-452a-ba66-29c916324b43\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.867572 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd"]
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.869166 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.879371 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-79vwh"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.884139 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:49:46 crc kubenswrapper[4745]: I1209 11:49:46.894201 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd"]
Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:46.955185 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.039045 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.040220 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8r88\" (UniqueName: \"kubernetes.io/projected/cbee1192-ad85-44db-9457-df5de60b7047-kube-api-access-x8r88\") pod \"rabbitmq-cluster-operator-manager-668c99d594-klghd\" (UID: \"cbee1192-ad85-44db-9457-df5de60b7047\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd"
Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.040330 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrvk\" (UniqueName: \"kubernetes.io/projected/4fb78653-78a7-4840-8bd4-1ac08145a845-kube-api-access-mnrvk\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"
Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.040407 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod
\"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.040446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.093562 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.108477 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.129874 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.142420 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8r88\" (UniqueName: \"kubernetes.io/projected/cbee1192-ad85-44db-9457-df5de60b7047-kube-api-access-x8r88\") pod \"rabbitmq-cluster-operator-manager-668c99d594-klghd\" (UID: \"cbee1192-ad85-44db-9457-df5de60b7047\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.142485 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrvk\" (UniqueName: \"kubernetes.io/projected/4fb78653-78a7-4840-8bd4-1ac08145a845-kube-api-access-mnrvk\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.142569 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.142594 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.142819 4745 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.142884 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:47.642866193 +0000 UTC m=+1074.468067717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.142958 4745 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.143015 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:47.643001106 +0000 UTC m=+1074.468202630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "metrics-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.170226 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8r88\" (UniqueName: \"kubernetes.io/projected/cbee1192-ad85-44db-9457-df5de60b7047-kube-api-access-x8r88\") pod \"rabbitmq-cluster-operator-manager-668c99d594-klghd\" (UID: \"cbee1192-ad85-44db-9457-df5de60b7047\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.172619 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.184405 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrvk\" (UniqueName: \"kubernetes.io/projected/4fb78653-78a7-4840-8bd4-1ac08145a845-kube-api-access-mnrvk\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.675708 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.675744 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.675962 4745 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.676022 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:48.676005028 +0000 UTC m=+1075.501206552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.676468 4745 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.676520 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:48.676496291 +0000 UTC m=+1075.501697815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "metrics-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.879031 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:49:47 crc kubenswrapper[4745]: I1209 11:49:47.879442 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.880180 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.880268 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:49.880240188 +0000 UTC m=+1076.705441712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.880609 4745 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:47 crc kubenswrapper[4745]: E1209 11:49:47.880689 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:49.880666759 +0000 UTC m=+1076.705868283 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.572767 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs"] Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.576682 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq"] Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.700044 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " 
pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.700101 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:48 crc kubenswrapper[4745]: E1209 11:49:48.700233 4745 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 11:49:48 crc kubenswrapper[4745]: E1209 11:49:48.700280 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:50.700267628 +0000 UTC m=+1077.525469152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "webhook-server-cert" not found Dec 09 11:49:48 crc kubenswrapper[4745]: E1209 11:49:48.700602 4745 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 11:49:48 crc kubenswrapper[4745]: E1209 11:49:48.700665 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:50.700650878 +0000 UTC m=+1077.525852402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "metrics-server-cert" not found Dec 09 11:49:48 crc kubenswrapper[4745]: W1209 11:49:48.746034 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668cdfd0_6b54_4750_b58f_97b26180b203.slice/crio-4cfd456ba247215164a69b7608dbf9a6ea2b4d0135e7656e527f61aea09c50d4 WatchSource:0}: Error finding container 4cfd456ba247215164a69b7608dbf9a6ea2b4d0135e7656e527f61aea09c50d4: Status 404 returned error can't find the container with id 4cfd456ba247215164a69b7608dbf9a6ea2b4d0135e7656e527f61aea09c50d4 Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.765179 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"] Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.806061 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf"] Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.971453 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" event={"ID":"42df7b5d-8be1-4670-b063-e83cf65b1dae","Type":"ContainerStarted","Data":"21e5f60f241c1b7fd9864f5af7f7526a566cdee1cb70ffe1d45929cb57d43cdc"} Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.981670 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" event={"ID":"ae7706be-4789-4153-adf4-9abb8e4ee8d8","Type":"ContainerStarted","Data":"e395215d7a4d2b40de86542987fbad0d464326cdafab0e85e3e4781d91849ec8"} Dec 09 11:49:48 crc kubenswrapper[4745]: I1209 11:49:48.982551 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" event={"ID":"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa","Type":"ContainerStarted","Data":"01f5f8017c03969ca47703f67887cd732be5aed93b984e3454896c4805cf7992"} Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.000384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" event={"ID":"668cdfd0-6b54-4750-b58f-97b26180b203","Type":"ContainerStarted","Data":"4cfd456ba247215164a69b7608dbf9a6ea2b4d0135e7656e527f61aea09c50d4"} Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.324660 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.478722 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.510401 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.543219 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.738669 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.739025 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.776246 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.783029 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.783889 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.798232 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-np88r"] Dec 09 11:49:49 crc kubenswrapper[4745]: W1209 11:49:49.805709 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40fc90a5_6278_4b54_8260_728265a2501a.slice/crio-92130e945df567eed2cefbed44759ac9b5b87617c8a28e91f1732fda3310856a WatchSource:0}: Error finding container 92130e945df567eed2cefbed44759ac9b5b87617c8a28e91f1732fda3310856a: Status 404 returned error can't find the container with id 92130e945df567eed2cefbed44759ac9b5b87617c8a28e91f1732fda3310856a Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.820458 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.836704 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.854683 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.858276 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"] Dec 09 11:49:49 crc 
kubenswrapper[4745]: I1209 11:49:49.867586 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.871335 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"] Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.890843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.890895 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.891006 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.891045 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:53.891030642 +0000 UTC m=+1080.716232166 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.891087 4745 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.891104 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:53.891098084 +0000 UTC m=+1080.716299608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:49 crc kubenswrapper[4745]: I1209 11:49:49.956144 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"] Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.984320 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6dvj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-jw6tj_openstack-operators(23db7946-4921-4ac3-aea8-55abc4c4ba1c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.994374 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v8j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-dfwwk_openstack-operators(8221e85a-690c-4344-9bd7-5adc0e40b513): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.994560 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7nwl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7jfkb_openstack-operators(97f9d87f-9f72-452a-ba66-29c916324b43): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:49 crc kubenswrapper[4745]: E1209 11:49:49.998732 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6dvj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-jw6tj_openstack-operators(23db7946-4921-4ac3-aea8-55abc4c4ba1c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.000631 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.015290 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nnsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-f582k_openstack-operators(6267400f-bf32-4ce3-b875-831cee43fc17): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.015437 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7nwl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7jfkb_openstack-operators(97f9d87f-9f72-452a-ba66-29c916324b43): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.015583 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v8j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-dfwwk_openstack-operators(8221e85a-690c-4344-9bd7-5adc0e40b513): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.017006 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podUID="8221e85a-690c-4344-9bd7-5adc0e40b513" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.017177 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43" Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.046411 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" event={"ID":"641a4ff2-1fa2-4402-a313-d27dcc9c4294","Type":"ContainerStarted","Data":"6e111f3e63bd6d54bd29191c4481abc1fa4bcf7274238a003c6d6bfa38fc8391"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.058792 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" event={"ID":"6354cdc2-296f-4479-a46e-9f2be37c4eef","Type":"ContainerStarted","Data":"e7e65f85f848b272a7cccf62f5f871703a9adaad3959abefa869f816c74ebe3f"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.120515 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" event={"ID":"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e","Type":"ContainerStarted","Data":"b94d6762fbb27339787ccd89a00f41b0541ecb409b85ab3bb4fef491a13b5171"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.143132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" event={"ID":"cbee1192-ad85-44db-9457-df5de60b7047","Type":"ContainerStarted","Data":"23b3aa348b4c61ee3048b3e4ed033aeab3e667027bed5fb7ef0f1708ec0720ef"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.168944 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" event={"ID":"8221e85a-690c-4344-9bd7-5adc0e40b513","Type":"ContainerStarted","Data":"9c3d114a162283318155267ca65096058de924f2ce3566adbaac4cb10fc70224"} Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.180401 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podUID="8221e85a-690c-4344-9bd7-5adc0e40b513" Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.181986 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" event={"ID":"faa047d4-ea12-493a-aa0a-b429e0c5a123","Type":"ContainerStarted","Data":"8368b37c8a38ee349a038fbde23ff380993864baa17a461c6f228a69509e22db"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.183800 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" event={"ID":"40fc90a5-6278-4b54-8260-728265a2501a","Type":"ContainerStarted","Data":"92130e945df567eed2cefbed44759ac9b5b87617c8a28e91f1732fda3310856a"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.185877 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" event={"ID":"336faca3-d04c-4ab5-b5dd-d9f031c80c64","Type":"ContainerStarted","Data":"45b47953fba62869b12089be63a6a451dcd2152cb71e52d54ad7f5de8e570cd1"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.191273 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" event={"ID":"3e400982-edc3-462a-8917-5857ff6dd61e","Type":"ContainerStarted","Data":"80feebed77f392bdfdf6dddb046d15a8cff098f99255e050bf1ef5d63329a476"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.194650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" event={"ID":"6267400f-bf32-4ce3-b875-831cee43fc17","Type":"ContainerStarted","Data":"f68a139f6316f59f174bf8aa932848f70b56a4bf21405291066de71bd48cebb1"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.208650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" event={"ID":"23db7946-4921-4ac3-aea8-55abc4c4ba1c","Type":"ContainerStarted","Data":"97379bc960671f1311e5c8270afff10a8b7fac549a9fd863591270de57125179"} Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.212675 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c" Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.215276 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" event={"ID":"b026a92d-e1be-43c9-8e7a-1f66260bab18","Type":"ContainerStarted","Data":"0546120ea88889ee3c71ffafd51fbbf9bef6b5fe825f1ef5edc8b9174aa085a9"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.236937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" event={"ID":"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d","Type":"ContainerStarted","Data":"e42df4f901580f34fe08e19d3f7398abfd3a5e61e6fe29aeb189bb80fe02433c"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.244753 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" event={"ID":"97f9d87f-9f72-452a-ba66-29c916324b43","Type":"ContainerStarted","Data":"cc2ac68db9f44da77d8bda9b0ed28b87da2790b52499098d08b519ffc690864b"} Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.255169 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43" Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.264746 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" event={"ID":"cbb6789a-1426-4ea2-aa2b-76959271ffc2","Type":"ContainerStarted","Data":"a258c12889e1f02d58462c3144fbd0082e450d072f767b4f089a378b68b83107"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.274316 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" event={"ID":"e98864fd-e07e-4f9f-a9c1-b55c69a26922","Type":"ContainerStarted","Data":"ad0c140c338242c93cdc97b498991c6e5e7f98149e605637cd2413453dbd26de"} Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.706719 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " 
pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:50 crc kubenswrapper[4745]: I1209 11:49:50.706759 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.706910 4745 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.706961 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:54.706944401 +0000 UTC m=+1081.532145925 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "webhook-server-cert" not found Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.707008 4745 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 11:49:50 crc kubenswrapper[4745]: E1209 11:49:50.707033 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:49:54.707021683 +0000 UTC m=+1081.532223207 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "metrics-server-cert" not found Dec 09 11:49:51 crc kubenswrapper[4745]: E1209 11:49:51.353684 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c" Dec 09 11:49:51 crc kubenswrapper[4745]: E1209 11:49:51.358681 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43" Dec 09 11:49:51 crc kubenswrapper[4745]: E1209 11:49:51.358829 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podUID="8221e85a-690c-4344-9bd7-5adc0e40b513" Dec 09 11:49:53 crc kubenswrapper[4745]: I1209 11:49:53.902461 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:49:53 crc kubenswrapper[4745]: I1209 11:49:53.902849 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:49:53 crc kubenswrapper[4745]: E1209 11:49:53.903277 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:53 crc kubenswrapper[4745]: E1209 11:49:53.903349 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:01.90332842 +0000 UTC m=+1088.728529944 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found Dec 09 11:49:53 crc kubenswrapper[4745]: E1209 11:49:53.904012 4745 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:53 crc kubenswrapper[4745]: E1209 11:49:53.904058 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:01.904032619 +0000 UTC m=+1088.729234143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:49:54 crc kubenswrapper[4745]: I1209 11:49:54.760768 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:54 crc kubenswrapper[4745]: I1209 11:49:54.761170 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: 
\"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:49:54 crc kubenswrapper[4745]: E1209 11:49:54.760998 4745 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 11:49:54 crc kubenswrapper[4745]: E1209 11:49:54.761286 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:02.761266991 +0000 UTC m=+1089.586468515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "metrics-server-cert" not found Dec 09 11:49:54 crc kubenswrapper[4745]: E1209 11:49:54.761352 4745 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 11:49:54 crc kubenswrapper[4745]: E1209 11:49:54.761559 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs podName:4fb78653-78a7-4840-8bd4-1ac08145a845 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:02.761542558 +0000 UTC m=+1089.586744082 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs") pod "openstack-operator-controller-manager-668858c49-7ffqt" (UID: "4fb78653-78a7-4840-8bd4-1ac08145a845") : secret "webhook-server-cert" not found Dec 09 11:49:55 crc kubenswrapper[4745]: I1209 11:49:55.476029 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:49:55 crc kubenswrapper[4745]: I1209 11:49:55.476126 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:50:01 crc kubenswrapper[4745]: I1209 11:50:01.969117 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:50:01 crc kubenswrapper[4745]: I1209 11:50:01.969676 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:50:01 crc kubenswrapper[4745]: E1209 11:50:01.969286 4745 secret.go:188] Couldn't 
get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:50:01 crc kubenswrapper[4745]: E1209 11:50:01.969816 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert podName:5dcc5255-fdaa-463a-960c-5bc89c469a25 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:17.969793332 +0000 UTC m=+1104.794994856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert") pod "openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" (UID: "5dcc5255-fdaa-463a-960c-5bc89c469a25") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 11:50:01 crc kubenswrapper[4745]: E1209 11:50:01.969829 4745 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 11:50:01 crc kubenswrapper[4745]: E1209 11:50:01.969887 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert podName:6368805c-1aab-4425-b599-671f89f30110 nodeName:}" failed. No retries permitted until 2025-12-09 11:50:17.969867564 +0000 UTC m=+1104.795069088 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert") pod "infra-operator-controller-manager-78d48bff9d-d2ksl" (UID: "6368805c-1aab-4425-b599-671f89f30110") : secret "infra-operator-webhook-server-cert" not found Dec 09 11:50:02 crc kubenswrapper[4745]: I1209 11:50:02.783246 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:02 crc kubenswrapper[4745]: I1209 11:50:02.783311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:02 crc kubenswrapper[4745]: I1209 11:50:02.793596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-webhook-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:02 crc kubenswrapper[4745]: I1209 11:50:02.798252 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fb78653-78a7-4840-8bd4-1ac08145a845-metrics-certs\") pod \"openstack-operator-controller-manager-668858c49-7ffqt\" (UID: \"4fb78653-78a7-4840-8bd4-1ac08145a845\") " 
pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:03 crc kubenswrapper[4745]: I1209 11:50:03.059438 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wmvqt" Dec 09 11:50:03 crc kubenswrapper[4745]: I1209 11:50:03.068720 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.037473 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.038332 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.045358 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6368805c-1aab-4425-b599-671f89f30110-cert\") pod \"infra-operator-controller-manager-78d48bff9d-d2ksl\" (UID: \"6368805c-1aab-4425-b599-671f89f30110\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.045646 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5dcc5255-fdaa-463a-960c-5bc89c469a25-cert\") pod \"openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47\" (UID: \"5dcc5255-fdaa-463a-960c-5bc89c469a25\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.108148 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j49xb" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.115825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:50:18 crc kubenswrapper[4745]: E1209 11:50:18.277497 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 09 11:50:18 crc kubenswrapper[4745]: E1209 11:50:18.277856 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gpjnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-np88r_openstack-operators(40fc90a5-6278-4b54-8260-728265a2501a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.331661 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7hrrl" Dec 09 11:50:18 crc kubenswrapper[4745]: I1209 11:50:18.340072 4745 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:50:20 crc kubenswrapper[4745]: E1209 11:50:20.822639 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 09 11:50:20 crc kubenswrapper[4745]: E1209 11:50:20.823166 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s8hzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-ljgwk_openstack-operators(b026a92d-e1be-43c9-8e7a-1f66260bab18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:21 crc kubenswrapper[4745]: E1209 11:50:21.700628 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 09 11:50:21 crc kubenswrapper[4745]: E1209 11:50:21.702843 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvn6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-sch7g_openstack-operators(336faca3-d04c-4ab5-b5dd-d9f031c80c64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:22 crc kubenswrapper[4745]: E1209 11:50:22.401699 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 09 11:50:22 crc kubenswrapper[4745]: E1209 11:50:22.401932 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6n5vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-mmpfw_openstack-operators(62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:25 crc kubenswrapper[4745]: E1209 11:50:25.045359 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 09 11:50:25 crc kubenswrapper[4745]: E1209 11:50:25.045884 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cfldx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-85z5r_openstack-operators(cbb6789a-1426-4ea2-aa2b-76959271ffc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:25 crc kubenswrapper[4745]: I1209 11:50:25.481261 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:50:25 crc kubenswrapper[4745]: I1209 11:50:25.481363 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:50:30 crc kubenswrapper[4745]: E1209 11:50:30.204866 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 09 11:50:30 crc kubenswrapper[4745]: E1209 11:50:30.205874 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vqg6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-6vwcq_openstack-operators(42df7b5d-8be1-4670-b063-e83cf65b1dae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:30 crc kubenswrapper[4745]: E1209 11:50:30.924201 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8" Dec 09 11:50:30 crc kubenswrapper[4745]: E1209 11:50:30.925076 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4rjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-4q2bv_openstack-operators(e98864fd-e07e-4f9f-a9c1-b55c69a26922): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:31 crc kubenswrapper[4745]: E1209 11:50:31.931411 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3" Dec 09 11:50:31 crc kubenswrapper[4745]: E1209 11:50:31.931652 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lvndq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-8h2wf_openstack-operators(ae7706be-4789-4153-adf4-9abb8e4ee8d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:32 crc kubenswrapper[4745]: E1209 11:50:32.749951 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 09 11:50:32 crc kubenswrapper[4745]: E1209 11:50:32.750155 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rwgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-dxc2l_openstack-operators(8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:33 crc kubenswrapper[4745]: E1209 11:50:33.436459 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 09 11:50:33 crc kubenswrapper[4745]: E1209 11:50:33.436688 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-h44ls_openstack-operators(faa047d4-ea12-493a-aa0a-b429e0c5a123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:33 crc kubenswrapper[4745]: E1209 11:50:33.940836 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 09 11:50:33 crc kubenswrapper[4745]: E1209 11:50:33.941053 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdw5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-2tl6s_openstack-operators(3e400982-edc3-462a-8917-5857ff6dd61e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:34 crc kubenswrapper[4745]: E1209 11:50:34.523919 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 09 11:50:34 crc kubenswrapper[4745]: E1209 11:50:34.524646 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6dvj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-jw6tj_openstack-operators(23db7946-4921-4ac3-aea8-55abc4c4ba1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:35 crc kubenswrapper[4745]: E1209 11:50:35.428318 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 09 11:50:35 crc kubenswrapper[4745]: E1209 11:50:35.428572 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v8j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-dfwwk_openstack-operators(8221e85a-690c-4344-9bd7-5adc0e40b513): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:36 crc kubenswrapper[4745]: E1209 11:50:36.101640 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 09 11:50:36 crc kubenswrapper[4745]: E1209 11:50:36.101869 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7nwl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7jfkb_openstack-operators(97f9d87f-9f72-452a-ba66-29c916324b43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:36 crc kubenswrapper[4745]: E1209 11:50:36.767712 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 09 11:50:36 crc kubenswrapper[4745]: E1209 11:50:36.767988 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x8r88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-klghd_openstack-operators(cbee1192-ad85-44db-9457-df5de60b7047): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:36 crc kubenswrapper[4745]: E1209 11:50:36.769212 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" podUID="cbee1192-ad85-44db-9457-df5de60b7047" Dec 09 11:50:37 crc kubenswrapper[4745]: E1209 11:50:37.100483 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" podUID="cbee1192-ad85-44db-9457-df5de60b7047" Dec 09 11:50:37 crc kubenswrapper[4745]: E1209 11:50:37.381733 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 09 11:50:37 crc kubenswrapper[4745]: E1209 11:50:37.381944 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvrsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-662j9_openstack-operators(6354cdc2-296f-4479-a46e-9f2be37c4eef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:37 crc kubenswrapper[4745]: E1209 11:50:37.904662 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 09 11:50:37 crc kubenswrapper[4745]: E1209 11:50:37.904848 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2n5nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-cxl42_openstack-operators(641a4ff2-1fa2-4402-a313-d27dcc9c4294): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:38 crc kubenswrapper[4745]: I1209 11:50:38.878873 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"] Dec 09 11:50:38 crc kubenswrapper[4745]: I1209 11:50:38.937576 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"] Dec 09 11:50:38 crc kubenswrapper[4745]: I1209 11:50:38.942692 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"] Dec 09 11:50:39 crc kubenswrapper[4745]: W1209 11:50:39.384207 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6368805c_1aab_4425_b599_671f89f30110.slice/crio-a5393f45cee679036cc6dac3ce139b970471f8d741660c353f756756555953d4 WatchSource:0}: Error finding container 
a5393f45cee679036cc6dac3ce139b970471f8d741660c353f756756555953d4: Status 404 returned error can't find the container with id a5393f45cee679036cc6dac3ce139b970471f8d741660c353f756756555953d4 Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.398616 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.399135 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nnsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-58d5ff84df-f582k_openstack-operators(6267400f-bf32-4ce3-b875-831cee43fc17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.400767 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" podUID="6267400f-bf32-4ce3-b875-831cee43fc17" Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.811212 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" podUID="6354cdc2-296f-4479-a46e-9f2be37c4eef" Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.822503 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c" Dec 09 11:50:39 crc kubenswrapper[4745]: E1209 11:50:39.853180 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" podUID="40fc90a5-6278-4b54-8260-728265a2501a" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.187080 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" podUID="b026a92d-e1be-43c9-8e7a-1f66260bab18" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.205996 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" event={"ID":"6368805c-1aab-4425-b599-671f89f30110","Type":"ContainerStarted","Data":"a5393f45cee679036cc6dac3ce139b970471f8d741660c353f756756555953d4"} Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.216921 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" podUID="336faca3-d04c-4ab5-b5dd-d9f031c80c64" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.241793 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" event={"ID":"668cdfd0-6b54-4750-b58f-97b26180b203","Type":"ContainerStarted","Data":"5a0e054f6fbf4f321cebcf1c7ee48fcd38ef11ce8587a91fa2f87489052838da"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.241836 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" event={"ID":"668cdfd0-6b54-4750-b58f-97b26180b203","Type":"ContainerStarted","Data":"41f57ca9873934f37e27adcc9b9172f6439a17477801fffd4751122f7bd8a14c"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.242551 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.252013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" event={"ID":"23db7946-4921-4ac3-aea8-55abc4c4ba1c","Type":"ContainerStarted","Data":"00597e19118043891035885b8d9228f25f4b2243f82423a4773df06700d1a6c4"} Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.260413 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.280700 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" podStartSLOduration=9.516957243 podStartE2EDuration="55.280682965s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:48.749042762 +0000 UTC m=+1075.574244286" lastFinishedPulling="2025-12-09 11:50:34.512768484 +0000 UTC m=+1121.337970008" observedRunningTime="2025-12-09 11:50:40.276092271 +0000 UTC m=+1127.101293795" watchObservedRunningTime="2025-12-09 11:50:40.280682965 +0000 UTC m=+1127.105884489" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.281468 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" event={"ID":"5dcc5255-fdaa-463a-960c-5bc89c469a25","Type":"ContainerStarted","Data":"a2fe71e3fde13d82be3ab6eb89604b37bcb967f3b7d737d9389a6c0377578f75"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.291612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" 
event={"ID":"40fc90a5-6278-4b54-8260-728265a2501a","Type":"ContainerStarted","Data":"cee496ce21e0e92b7567d50a032f6d040ac4e11d58f5baaaf53595ebe32c5d4b"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.303929 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" event={"ID":"6354cdc2-296f-4479-a46e-9f2be37c4eef","Type":"ContainerStarted","Data":"4810d2fac16f9331848837866488f304dd72d35625297a1d9246f6d3a7871c5f"} Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.306807 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" podUID="faa047d4-ea12-493a-aa0a-b429e0c5a123" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.307586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" event={"ID":"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e","Type":"ContainerStarted","Data":"68ffadbc09ec91c68ee20c73a7cb9f24293731527d59aed03bf42d2dc7222409"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.307616 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" event={"ID":"a8ec2c42-ff99-42c1-b0bb-e362207f4e3e","Type":"ContainerStarted","Data":"14c9540df473a7cb73253bcc47d6d3b189514b2be25dd0100013c6bd8046997b"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.308090 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.335957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" 
event={"ID":"b026a92d-e1be-43c9-8e7a-1f66260bab18","Type":"ContainerStarted","Data":"b1ef8a808cc186607842e9f3095b8d7bd3afaf68b14f6c2a726db705ca1473d7"} Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.338722 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" podUID="6354cdc2-296f-4479-a46e-9f2be37c4eef" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.355324 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" event={"ID":"4fb78653-78a7-4840-8bd4-1ac08145a845","Type":"ContainerStarted","Data":"8f86201b192f02bb8c45d502501ca9dae1db25e558a5c8f6c133a7a6ea4a6246"} Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.355367 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" event={"ID":"4fb78653-78a7-4840-8bd4-1ac08145a845","Type":"ContainerStarted","Data":"7ccc2b178119860615d242a77d6e388f290fdef6540fabb35b6caf82dace3d11"} Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.383299 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" podUID="3e400982-edc3-462a-8917-5857ff6dd61e" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.445361 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" podStartSLOduration=10.294978442 podStartE2EDuration="55.445325088s" 
podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.355729498 +0000 UTC m=+1076.180931022" lastFinishedPulling="2025-12-09 11:50:34.506076144 +0000 UTC m=+1121.331277668" observedRunningTime="2025-12-09 11:50:40.433221382 +0000 UTC m=+1127.258422906" watchObservedRunningTime="2025-12-09 11:50:40.445325088 +0000 UTC m=+1127.270526612" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.456523 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podUID="8221e85a-690c-4344-9bd7-5adc0e40b513" Dec 09 11:50:40 crc kubenswrapper[4745]: I1209 11:50:40.586926 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" podStartSLOduration=54.586905831 podStartE2EDuration="54.586905831s" podCreationTimestamp="2025-12-09 11:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:50:40.572496983 +0000 UTC m=+1127.397698527" watchObservedRunningTime="2025-12-09 11:50:40.586905831 +0000 UTC m=+1127.412107355" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.747153 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" podUID="62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.750290 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" podUID="641a4ff2-1fa2-4402-a313-d27dcc9c4294" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.764787 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43" Dec 09 11:50:40 crc kubenswrapper[4745]: E1209 11:50:40.774074 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" podUID="ae7706be-4789-4153-adf4-9abb8e4ee8d8" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.380668 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" event={"ID":"641a4ff2-1fa2-4402-a313-d27dcc9c4294","Type":"ContainerStarted","Data":"8dc4ec3bf37599bdc2de90e85c29598d4946cd3282998825b1a0e0e06fabcb47"} Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.387178 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" podUID="641a4ff2-1fa2-4402-a313-d27dcc9c4294" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.405987 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" 
event={"ID":"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa","Type":"ContainerStarted","Data":"e35eae9af24f6e827d4b1dcd13958868b6665eb171b3c1acb30e64ec75b4e460"} Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.424035 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" event={"ID":"336faca3-d04c-4ab5-b5dd-d9f031c80c64","Type":"ContainerStarted","Data":"0ccc3f616dcccc99cce46297bbfdcdbe75869660f188c14c5416ff931ea4778b"} Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.430022 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" event={"ID":"97f9d87f-9f72-452a-ba66-29c916324b43","Type":"ContainerStarted","Data":"17196e3e5368641236e465d01d7a50bed66b6b333f998085b2f9007a4993e4d4"} Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.432572 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.433910 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" event={"ID":"3e400982-edc3-462a-8917-5857ff6dd61e","Type":"ContainerStarted","Data":"57be4029a5cc234e4de7e2a24427e2b7ccccc0877ee4b38fb82d1095ec8f843f"} Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.436683 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" podUID="3e400982-edc3-462a-8917-5857ff6dd61e" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.437985 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" event={"ID":"8221e85a-690c-4344-9bd7-5adc0e40b513","Type":"ContainerStarted","Data":"10c679bf427cd4f38a923b984bfa6c730997052d2986b789ba969088c3b7edb4"} Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.555529 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podUID="8221e85a-690c-4344-9bd7-5adc0e40b513" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.639700 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" event={"ID":"ae7706be-4789-4153-adf4-9abb8e4ee8d8","Type":"ContainerStarted","Data":"19019be825735e7af4697967d959693828d695f2827e58b8319f24900d5ca07b"} Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.659794 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" podUID="ae7706be-4789-4153-adf4-9abb8e4ee8d8" Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.668740 4745 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" event={"ID":"faa047d4-ea12-493a-aa0a-b429e0c5a123","Type":"ContainerStarted","Data":"a860ce89d2792a02e959555cb073865b97237cf3a0e3b9e2cec9f51c26e79cce"} Dec 09 11:50:41 crc kubenswrapper[4745]: I1209 11:50:41.669998 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt" Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.699961 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" podUID="6354cdc2-296f-4479-a46e-9f2be37c4eef" Dec 09 11:50:41 crc kubenswrapper[4745]: E1209 11:50:41.699961 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" podUID="faa047d4-ea12-493a-aa0a-b429e0c5a123" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.021252 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" podUID="8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.698660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" 
event={"ID":"62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa","Type":"ContainerStarted","Data":"58a7763075411e99701a75a62ea88249e0216c034a32683c0d7aedab9abe0f52"} Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.700487 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.703674 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" event={"ID":"40fc90a5-6278-4b54-8260-728265a2501a","Type":"ContainerStarted","Data":"77ec0e4b10b30677f5cd754c776c6d99651e9833225622655d8a805046c9db83"} Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.704326 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.712990 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" event={"ID":"336faca3-d04c-4ab5-b5dd-d9f031c80c64","Type":"ContainerStarted","Data":"839c5e6d4f3d5d71ab491f4eed486dedf932fe3de391627ef065a68f44f3af0f"} Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.713097 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.716392 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" event={"ID":"b026a92d-e1be-43c9-8e7a-1f66260bab18","Type":"ContainerStarted","Data":"709bce6f029e2881b8f4005b1c367ca367314cf3da6794bbb9a8e52c8e4e54bd"} Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.717181 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.723634 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" event={"ID":"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d","Type":"ContainerStarted","Data":"64739d2e84bd1409b96ce8f94f6a149b9637f5c3b81038ee06419be02cf85bd3"} Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.728336 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" podUID="641a4ff2-1fa2-4402-a313-d27dcc9c4294" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.728437 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" podUID="3e400982-edc3-462a-8917-5857ff6dd61e" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.728490 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" podUID="8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.728623 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" podUID="faa047d4-ea12-493a-aa0a-b429e0c5a123" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.728886 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw" podStartSLOduration=4.549147747 podStartE2EDuration="57.728866026s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:48.837684119 +0000 UTC m=+1075.662885643" lastFinishedPulling="2025-12-09 11:50:42.017402398 +0000 UTC m=+1128.842603922" observedRunningTime="2025-12-09 11:50:42.728090145 +0000 UTC m=+1129.553291669" watchObservedRunningTime="2025-12-09 11:50:42.728866026 +0000 UTC m=+1129.554067550" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.749133 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk" podStartSLOduration=6.364757855 podStartE2EDuration="57.749103841s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.7836385 +0000 UTC m=+1076.608840024" lastFinishedPulling="2025-12-09 11:50:41.167984486 +0000 UTC m=+1127.993186010" observedRunningTime="2025-12-09 11:50:42.745840213 +0000 UTC m=+1129.571041757" watchObservedRunningTime="2025-12-09 11:50:42.749103841 +0000 UTC m=+1129.574305375" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.811983 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" 
podUID="cbb6789a-1426-4ea2-aa2b-76959271ffc2" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.864354 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g" podStartSLOduration=5.301580317 podStartE2EDuration="57.864328533s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.702614619 +0000 UTC m=+1076.527816143" lastFinishedPulling="2025-12-09 11:50:42.265362835 +0000 UTC m=+1129.090564359" observedRunningTime="2025-12-09 11:50:42.82669841 +0000 UTC m=+1129.651899934" watchObservedRunningTime="2025-12-09 11:50:42.864328533 +0000 UTC m=+1129.689530047" Dec 09 11:50:42 crc kubenswrapper[4745]: I1209 11:50:42.926719 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" podStartSLOduration=6.563177368 podStartE2EDuration="57.926699483s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.836949396 +0000 UTC m=+1076.662150920" lastFinishedPulling="2025-12-09 11:50:41.200471511 +0000 UTC m=+1128.025673035" observedRunningTime="2025-12-09 11:50:42.92063701 +0000 UTC m=+1129.745838534" watchObservedRunningTime="2025-12-09 11:50:42.926699483 +0000 UTC m=+1129.751901007" Dec 09 11:50:42 crc kubenswrapper[4745]: E1209 11:50:42.930097 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" podUID="e98864fd-e07e-4f9f-a9c1-b55c69a26922" Dec 09 11:50:43 crc kubenswrapper[4745]: E1209 11:50:43.491919 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" podUID="42df7b5d-8be1-4670-b063-e83cf65b1dae" Dec 09 11:50:43 crc kubenswrapper[4745]: I1209 11:50:43.731634 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" event={"ID":"cbb6789a-1426-4ea2-aa2b-76959271ffc2","Type":"ContainerStarted","Data":"0fbf35437856510494bbe85d6891687594e384f6bd5bca6cf9dc29c8d498ed99"} Dec 09 11:50:43 crc kubenswrapper[4745]: I1209 11:50:43.736175 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" event={"ID":"e98864fd-e07e-4f9f-a9c1-b55c69a26922","Type":"ContainerStarted","Data":"de0416d984f56514379059cdd55b2725bb022c66d46cb5770824f3f10c27c081"} Dec 09 11:50:43 crc kubenswrapper[4745]: I1209 11:50:43.741491 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" event={"ID":"ae7706be-4789-4153-adf4-9abb8e4ee8d8","Type":"ContainerStarted","Data":"f5cde002740287f84f0f3f6d12039652fcd731aa9a6bd3361c46e3a9888dce76"} Dec 09 11:50:43 crc kubenswrapper[4745]: I1209 11:50:43.741729 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" Dec 09 11:50:43 crc kubenswrapper[4745]: I1209 11:50:43.743236 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" event={"ID":"42df7b5d-8be1-4670-b063-e83cf65b1dae","Type":"ContainerStarted","Data":"0e8ce03f5a1ed5ad3185134b6838aecf7f7d3f93432052cba6a47a27d215ea1d"} Dec 09 11:50:44 crc kubenswrapper[4745]: I1209 11:50:43.997714 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf" podStartSLOduration=4.506638102 
podStartE2EDuration="58.997697792s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:48.85593235 +0000 UTC m=+1075.681133874" lastFinishedPulling="2025-12-09 11:50:43.34699205 +0000 UTC m=+1130.172193564" observedRunningTime="2025-12-09 11:50:43.991903276 +0000 UTC m=+1130.817104820" watchObservedRunningTime="2025-12-09 11:50:43.997697792 +0000 UTC m=+1130.822899316" Dec 09 11:50:45 crc kubenswrapper[4745]: I1209 11:50:45.911839 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-bngxs" Dec 09 11:50:46 crc kubenswrapper[4745]: I1209 11:50:46.092525 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5wtvn" Dec 09 11:50:47 crc kubenswrapper[4745]: I1209 11:50:47.122089 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-np88r" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.386374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" event={"ID":"cbb6789a-1426-4ea2-aa2b-76959271ffc2","Type":"ContainerStarted","Data":"4adef5e1af6ff05362f0df39fd99449ba7730960492c8f3891145ff6d93360cf"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.387231 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.392133 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" event={"ID":"8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d","Type":"ContainerStarted","Data":"a7bb951380c676cacc60bed233c355f6561c5708972432292664765be03d2862"} Dec 09 11:50:51 crc 
kubenswrapper[4745]: I1209 11:50:51.393062 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.396201 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" event={"ID":"cbee1192-ad85-44db-9457-df5de60b7047","Type":"ContainerStarted","Data":"00285d1c9e0da9fdde502e2928c01a5b5bbfe36b1325d73f9118b0d69709addc"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.406400 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" event={"ID":"6267400f-bf32-4ce3-b875-831cee43fc17","Type":"ContainerStarted","Data":"86396245e8ab529909defbf40c9a4eb8ab33452aaca8e6925d313fb8320ee6ab"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.413213 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" event={"ID":"e98864fd-e07e-4f9f-a9c1-b55c69a26922","Type":"ContainerStarted","Data":"2fb9e38431d3c4ad93d3483fdfd2f7ae66795e7dbe53acbda213f15bacc65175"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.413452 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.418352 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r" podStartSLOduration=5.393269506 podStartE2EDuration="1m6.418332684s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.783658131 +0000 UTC m=+1076.608859655" lastFinishedPulling="2025-12-09 11:50:50.808721309 +0000 UTC m=+1137.633922833" observedRunningTime="2025-12-09 11:50:51.410662798 
+0000 UTC m=+1138.235864322" watchObservedRunningTime="2025-12-09 11:50:51.418332684 +0000 UTC m=+1138.243534208" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.419370 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" event={"ID":"5dcc5255-fdaa-463a-960c-5bc89c469a25","Type":"ContainerStarted","Data":"ced187cdc0aa6b337898a94200f7c59796284637b27da6714c060815a846f8af"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.430595 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" event={"ID":"42df7b5d-8be1-4670-b063-e83cf65b1dae","Type":"ContainerStarted","Data":"421386d0e38ebb5eb8d50ee7d193b12e1d2e016572546958b9ec4fe191151f11"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.430870 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.432033 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l" podStartSLOduration=5.18990376 podStartE2EDuration="1m6.432011192s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.560636866 +0000 UTC m=+1076.385838390" lastFinishedPulling="2025-12-09 11:50:50.802744298 +0000 UTC m=+1137.627945822" observedRunningTime="2025-12-09 11:50:51.426892605 +0000 UTC m=+1138.252094149" watchObservedRunningTime="2025-12-09 11:50:51.432011192 +0000 UTC m=+1138.257212706" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.438809 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" 
event={"ID":"6368805c-1aab-4425-b599-671f89f30110","Type":"ContainerStarted","Data":"d97091c33ee45f7ec964f0e4bcccd9e4c3fd69a1b1f13d670d8ea00aa1a5ded3"} Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.465855 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-klghd" podStartSLOduration=4.554484707 podStartE2EDuration="1m5.465823743s" podCreationTimestamp="2025-12-09 11:49:46 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.890239061 +0000 UTC m=+1076.715440585" lastFinishedPulling="2025-12-09 11:50:50.801578097 +0000 UTC m=+1137.626779621" observedRunningTime="2025-12-09 11:50:51.453895482 +0000 UTC m=+1138.279097006" watchObservedRunningTime="2025-12-09 11:50:51.465823743 +0000 UTC m=+1138.291025267" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.485469 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv" podStartSLOduration=4.569009218 podStartE2EDuration="1m5.485443651s" podCreationTimestamp="2025-12-09 11:49:46 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.889634895 +0000 UTC m=+1076.714836419" lastFinishedPulling="2025-12-09 11:50:50.806069318 +0000 UTC m=+1137.631270852" observedRunningTime="2025-12-09 11:50:51.478863574 +0000 UTC m=+1138.304065098" watchObservedRunningTime="2025-12-09 11:50:51.485443651 +0000 UTC m=+1138.310645175" Dec 09 11:50:51 crc kubenswrapper[4745]: I1209 11:50:51.498672 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq" podStartSLOduration=4.312999398 podStartE2EDuration="1m6.498654737s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:48.61750035 +0000 UTC m=+1075.442701874" lastFinishedPulling="2025-12-09 11:50:50.803155689 +0000 UTC m=+1137.628357213" 
observedRunningTime="2025-12-09 11:50:51.497914977 +0000 UTC m=+1138.323116501" watchObservedRunningTime="2025-12-09 11:50:51.498654737 +0000 UTC m=+1138.323856261" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.457807 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" event={"ID":"6368805c-1aab-4425-b599-671f89f30110","Type":"ContainerStarted","Data":"7069f2d5cc45f8ffad05a69caabee5090895702585ab5348990b1a101bdccf47"} Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.458656 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.462471 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" event={"ID":"6267400f-bf32-4ce3-b875-831cee43fc17","Type":"ContainerStarted","Data":"08bdd01398dfcec3a398820e4a49d001c3227edaa742d794715a5b6d32e831c9"} Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.463189 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.466847 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" event={"ID":"5dcc5255-fdaa-463a-960c-5bc89c469a25","Type":"ContainerStarted","Data":"40e1ef43000d4f38d14389dddf8bb1f780e0bd1f3aa3ea54cb1981c98f78a78a"} Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.466932 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.491201 4745 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl" podStartSLOduration=56.089382429 podStartE2EDuration="1m7.491180701s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:50:39.387274018 +0000 UTC m=+1126.212475562" lastFinishedPulling="2025-12-09 11:50:50.78907231 +0000 UTC m=+1137.614273834" observedRunningTime="2025-12-09 11:50:52.488741345 +0000 UTC m=+1139.313942869" watchObservedRunningTime="2025-12-09 11:50:52.491180701 +0000 UTC m=+1139.316382225" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.517343 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k" podStartSLOduration=6.730728507 podStartE2EDuration="1m7.517322135s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:50.015079982 +0000 UTC m=+1076.840281506" lastFinishedPulling="2025-12-09 11:50:50.80167361 +0000 UTC m=+1137.626875134" observedRunningTime="2025-12-09 11:50:52.512791453 +0000 UTC m=+1139.337992987" watchObservedRunningTime="2025-12-09 11:50:52.517322135 +0000 UTC m=+1139.342523649" Dec 09 11:50:52 crc kubenswrapper[4745]: I1209 11:50:52.547290 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47" podStartSLOduration=56.176803142 podStartE2EDuration="1m7.547273331s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:50:39.388274345 +0000 UTC m=+1126.213475869" lastFinishedPulling="2025-12-09 11:50:50.758744534 +0000 UTC m=+1137.583946058" observedRunningTime="2025-12-09 11:50:52.543250563 +0000 UTC m=+1139.368452087" watchObservedRunningTime="2025-12-09 11:50:52.547273331 +0000 UTC m=+1139.372474855" Dec 09 11:50:52 crc kubenswrapper[4745]: E1209 11:50:52.557387 4745 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podUID="97f9d87f-9f72-452a-ba66-29c916324b43"
Dec 09 11:50:53 crc kubenswrapper[4745]: I1209 11:50:53.079756 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-668858c49-7ffqt"
Dec 09 11:50:53 crc kubenswrapper[4745]: E1209 11:50:53.582392 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podUID="23db7946-4921-4ac3-aea8-55abc4c4ba1c"
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.547180 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.557573 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.576373 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.577224 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.577297 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48" gracePeriod=600
Dec 09 11:50:55 crc kubenswrapper[4745]: I1209 11:50:55.940875 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ljgwk"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.046153 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-mmpfw"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.131930 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-8h2wf"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.323235 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-sch7g"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.367970 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-dxc2l"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.741313 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-85z5r"
Dec 09 11:50:56 crc kubenswrapper[4745]: I1209 11:50:56.959343 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-f582k"
Dec 09 11:50:57 crc kubenswrapper[4745]: I1209 11:50:57.133411 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-4q2bv"
Dec 09 11:50:58 crc kubenswrapper[4745]: I1209 11:50:58.357329 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47"
Dec 09 11:50:58 crc kubenswrapper[4745]: I1209 11:50:58.360720 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-d2ksl"
Dec 09 11:51:00 crc kubenswrapper[4745]: I1209 11:51:00.793821 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48" exitCode=0
Dec 09 11:51:00 crc kubenswrapper[4745]: I1209 11:51:00.793911 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48"}
Dec 09 11:51:00 crc kubenswrapper[4745]: I1209 11:51:00.794110 4745 scope.go:117] "RemoveContainer" containerID="6250b21aafb6740efccc2c70a8117a20c7fea1d66182660852cc7179196a393f"
Dec 09 11:51:05 crc kubenswrapper[4745]: I1209 11:51:05.858184 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-6vwcq"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.914729 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" event={"ID":"97f9d87f-9f72-452a-ba66-29c916324b43","Type":"ContainerStarted","Data":"2e41f4079e140732c52a83955f7371ea55f0f64e1153409b85e8be9fefb563e0"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.915190 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.916717 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" event={"ID":"3e400982-edc3-462a-8917-5857ff6dd61e","Type":"ContainerStarted","Data":"abfa28b7478ff58d6e2c30ef702b92e6da276f6ab65effdb07fe94828c29d2f2"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.916873 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.918614 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" event={"ID":"8221e85a-690c-4344-9bd7-5adc0e40b513","Type":"ContainerStarted","Data":"35b54f770813ec36a4a1a44a05ef28a53ac7cc2252a1386473028ffd2c05a56e"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.919291 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.922555 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.925057 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" event={"ID":"faa047d4-ea12-493a-aa0a-b429e0c5a123","Type":"ContainerStarted","Data":"4b137b95cec65a21d6bf1557fee7b355382ea0345145ef05865a54b529f42600"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.925333 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.930835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" event={"ID":"641a4ff2-1fa2-4402-a313-d27dcc9c4294","Type":"ContainerStarted","Data":"8fff6b1df4119ef111fdcb72b4f031622ed9232601b261824e1db4eac8f14279"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.931040 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.936694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" event={"ID":"6354cdc2-296f-4479-a46e-9f2be37c4eef","Type":"ContainerStarted","Data":"ab3940878d1ce8d9c2e1c0759b22c28710e2b1b8a5c99d03c50e691128dd42da"}
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.936922 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:51:06 crc kubenswrapper[4745]: I1209 11:51:06.974123 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb" podStartSLOduration=5.813223074 podStartE2EDuration="1m21.974100376s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.994448157 +0000 UTC m=+1076.819649681" lastFinishedPulling="2025-12-09 11:51:06.155325459 +0000 UTC m=+1152.980526983" observedRunningTime="2025-12-09 11:51:06.964915699 +0000 UTC m=+1153.790117233" watchObservedRunningTime="2025-12-09 11:51:06.974100376 +0000 UTC m=+1153.799301900"
Dec 09 11:51:07 crc kubenswrapper[4745]: I1209 11:51:07.050066 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9" podStartSLOduration=5.91112322 podStartE2EDuration="1m22.050048601s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.70303285 +0000 UTC m=+1076.528234374" lastFinishedPulling="2025-12-09 11:51:05.841958211 +0000 UTC m=+1152.667159755" observedRunningTime="2025-12-09 11:51:07.043456434 +0000 UTC m=+1153.868657958" watchObservedRunningTime="2025-12-09 11:51:07.050048601 +0000 UTC m=+1153.875250125"
Dec 09 11:51:07 crc kubenswrapper[4745]: I1209 11:51:07.129356 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls" podStartSLOduration=6.030865743 podStartE2EDuration="1m22.129341166s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.725543976 +0000 UTC m=+1076.550745500" lastFinishedPulling="2025-12-09 11:51:05.824019399 +0000 UTC m=+1152.649220923" observedRunningTime="2025-12-09 11:51:07.127051305 +0000 UTC m=+1153.952252849" watchObservedRunningTime="2025-12-09 11:51:07.129341166 +0000 UTC m=+1153.954542680"
Dec 09 11:51:07 crc kubenswrapper[4745]: I1209 11:51:07.299040 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk" podStartSLOduration=6.456428943 podStartE2EDuration="1m22.299016605s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.994216781 +0000 UTC m=+1076.819418305" lastFinishedPulling="2025-12-09 11:51:05.836804443 +0000 UTC m=+1152.662005967" observedRunningTime="2025-12-09 11:51:07.296691172 +0000 UTC m=+1154.121892696" watchObservedRunningTime="2025-12-09 11:51:07.299016605 +0000 UTC m=+1154.124218129"
Dec 09 11:51:07 crc kubenswrapper[4745]: I1209 11:51:07.626458 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42" podStartSLOduration=6.569039985 podStartE2EDuration="1m22.626440542s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.78364123 +0000 UTC m=+1076.608842754" lastFinishedPulling="2025-12-09 11:51:05.841041787 +0000 UTC m=+1152.666243311" observedRunningTime="2025-12-09 11:51:07.616429282 +0000 UTC m=+1154.441630816" watchObservedRunningTime="2025-12-09 11:51:07.626440542 +0000 UTC m=+1154.451642066"
Dec 09 11:51:07 crc kubenswrapper[4745]: I1209 11:51:07.666660 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s" podStartSLOduration=6.73451956 podStartE2EDuration="1m22.666631904s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.890931089 +0000 UTC m=+1076.716132613" lastFinishedPulling="2025-12-09 11:51:05.823043433 +0000 UTC m=+1152.648244957" observedRunningTime="2025-12-09 11:51:07.659954864 +0000 UTC m=+1154.485156388" watchObservedRunningTime="2025-12-09 11:51:07.666631904 +0000 UTC m=+1154.491833428"
Dec 09 11:51:10 crc kubenswrapper[4745]: I1209 11:51:09.988907 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" event={"ID":"23db7946-4921-4ac3-aea8-55abc4c4ba1c","Type":"ContainerStarted","Data":"3307eecc4817dfc8fbdc6e64fcb7f638bdedcb5f0bba7337b7c366e9ed736c06"}
Dec 09 11:51:10 crc kubenswrapper[4745]: I1209 11:51:09.992847 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:51:10 crc kubenswrapper[4745]: I1209 11:51:10.127103 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj" podStartSLOduration=6.137533356 podStartE2EDuration="1m25.127084805s" podCreationTimestamp="2025-12-09 11:49:45 +0000 UTC" firstStartedPulling="2025-12-09 11:49:49.9841571 +0000 UTC m=+1076.809358614" lastFinishedPulling="2025-12-09 11:51:08.973708539 +0000 UTC m=+1155.798910063" observedRunningTime="2025-12-09 11:51:10.123012776 +0000 UTC m=+1156.948214300" watchObservedRunningTime="2025-12-09 11:51:10.127084805 +0000 UTC m=+1156.952286329"
Dec 09 11:51:15 crc kubenswrapper[4745]: I1209 11:51:15.967085 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h44ls"
Dec 09 11:51:16 crc kubenswrapper[4745]: I1209 11:51:16.125743 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-cxl42"
Dec 09 11:51:16 crc kubenswrapper[4745]: I1209 11:51:16.748776 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-662j9"
Dec 09 11:51:16 crc kubenswrapper[4745]: I1209 11:51:16.807744 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2tl6s"
Dec 09 11:51:16 crc kubenswrapper[4745]: I1209 11:51:16.886713 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jw6tj"
Dec 09 11:51:17 crc kubenswrapper[4745]: I1209 11:51:17.042339 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7jfkb"
Dec 09 11:51:17 crc kubenswrapper[4745]: I1209 11:51:17.111956 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-dfwwk"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.047318 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"]
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.049204 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.052781 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h27xl"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.052986 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.056465 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.056583 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.062968 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"]
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.158305 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.158492 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcxk\" (UniqueName: \"kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.195231 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"]
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.196747 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.201818 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.203452 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"]
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.259831 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.259900 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcxk\" (UniqueName: \"kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.259935 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.259976 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gb9\" (UniqueName: \"kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.260036 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.261088 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.278436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcxk\" (UniqueName: \"kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk\") pod \"dnsmasq-dns-84bb9d8bd9-596p2\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.361714 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.362053 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.362086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gb9\" (UniqueName: \"kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.362774 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.362975 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.370424 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.380583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gb9\" (UniqueName: \"kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9\") pod \"dnsmasq-dns-5f854695bc-zplcs\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.516558 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-zplcs"
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.604683 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"]
Dec 09 11:51:32 crc kubenswrapper[4745]: W1209 11:51:32.627552 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3eaea34_0343_402e_a370_0eaf917587d0.slice/crio-4cfd28671f439e47ae8212f8c4c3d1426f619f34be5525cc1ac1cd23d3430b86 WatchSource:0}: Error finding container 4cfd28671f439e47ae8212f8c4c3d1426f619f34be5525cc1ac1cd23d3430b86: Status 404 returned error can't find the container with id 4cfd28671f439e47ae8212f8c4c3d1426f619f34be5525cc1ac1cd23d3430b86
Dec 09 11:51:32 crc kubenswrapper[4745]: I1209 11:51:32.976564 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"]
Dec 09 11:51:32 crc kubenswrapper[4745]: W1209 11:51:32.984394 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc4dea0_b1c3_4740_9f46_9255fae9b69b.slice/crio-44e0c8d152f788909b95d0ac2bca5a66800ad098a75e47f94e27134aaa526437 WatchSource:0}: Error finding container 44e0c8d152f788909b95d0ac2bca5a66800ad098a75e47f94e27134aaa526437: Status 404 returned error can't find the container with id 44e0c8d152f788909b95d0ac2bca5a66800ad098a75e47f94e27134aaa526437
Dec 09 11:51:33 crc kubenswrapper[4745]: I1209 11:51:33.172154 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2" event={"ID":"d3eaea34-0343-402e-a370-0eaf917587d0","Type":"ContainerStarted","Data":"4cfd28671f439e47ae8212f8c4c3d1426f619f34be5525cc1ac1cd23d3430b86"}
Dec 09 11:51:33 crc kubenswrapper[4745]: I1209 11:51:33.181018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-zplcs" event={"ID":"1bc4dea0-b1c3-4740-9f46-9255fae9b69b","Type":"ContainerStarted","Data":"44e0c8d152f788909b95d0ac2bca5a66800ad098a75e47f94e27134aaa526437"}
Dec 09 11:51:34 crc kubenswrapper[4745]: I1209 11:51:34.699350 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"]
Dec 09 11:51:34 crc kubenswrapper[4745]: I1209 11:51:34.742899 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"]
Dec 09 11:51:34 crc kubenswrapper[4745]: I1209 11:51:34.747251 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:34 crc kubenswrapper[4745]: I1209 11:51:34.755834 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"]
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.072895 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhsvt\" (UniqueName: \"kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.073058 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.073111 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.177843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhsvt\" (UniqueName: \"kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.177989 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.178157 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.179048 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.181243 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.219857 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhsvt\" (UniqueName: \"kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt\") pod \"dnsmasq-dns-c7cbb8f79-c4t4v\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.253949 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"]
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.264033 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"]
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.266982 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.298080 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"]
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.380948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.381422 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgzz\" (UniqueName: \"kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.381644 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.393669 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.483933 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgzz\" (UniqueName: \"kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.484325 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.484416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.485314 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.485824 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.527963 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgzz\" (UniqueName: \"kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz\") pod \"dnsmasq-dns-95f5f6995-sqvnn\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.598788 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.950611 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.952221 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955028 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jpv2k"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955219 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955281 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955410 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955631 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955686 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.955806 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 09 11:51:35 crc kubenswrapper[4745]: I1209 11:51:35.974358 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.103739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104265 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104297 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104325 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104392 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kts6\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104414 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104476 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104523 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104553 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.104602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.233807 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.233921 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.233963 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.233990 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234011 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234067 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kts6\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234300 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234337 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.234387 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.237449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.237632 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.237846 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 
09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.238755 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.238901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.242413 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.246473 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.249254 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.252057 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.283647 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.293629 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kts6\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.300834 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.303799 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"] Dec 09 11:51:36 crc kubenswrapper[4745]: W1209 11:51:36.318176 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fd5f7b_2a10_4593_9fed_b8bed2db29af.slice/crio-915f9486fd41905da37cca774c60f94d4aaea59e252993e88398c9c3951ce719 WatchSource:0}: Error finding container 915f9486fd41905da37cca774c60f94d4aaea59e252993e88398c9c3951ce719: Status 404 returned error can't find the container with id 
915f9486fd41905da37cca774c60f94d4aaea59e252993e88398c9c3951ce719 Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.350675 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.377771 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"] Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.412330 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.419012 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.419162 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.422057 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.422294 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.423985 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.425197 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.425946 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.426942 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.428060 4745 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cn9rk" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjnl\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552288 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552413 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " 
pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552484 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552687 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552729 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.552757 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 
11:51:36.552779 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658584 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658624 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658648 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658744 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjnl\" 
(UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658778 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658842 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658879 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658912 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.658949 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") 
" pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.660096 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.660979 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.661883 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.662098 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.662197 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.662306 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.663265 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.677550 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.679993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.683872 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjnl\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.697252 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " 
pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.713922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.716245 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.819875 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:51:36 crc kubenswrapper[4745]: I1209 11:51:36.989916 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:51:37 crc kubenswrapper[4745]: W1209 11:51:37.075298 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceba626e_26d1_495f_b88d_fed69e445ddb.slice/crio-e2bad3265f0201a0d7ec8975a55822f12482128226d12f4802d071a66515b01e WatchSource:0}: Error finding container e2bad3265f0201a0d7ec8975a55822f12482128226d12f4802d071a66515b01e: Status 404 returned error can't find the container with id e2bad3265f0201a0d7ec8975a55822f12482128226d12f4802d071a66515b01e Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.291374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerStarted","Data":"e2bad3265f0201a0d7ec8975a55822f12482128226d12f4802d071a66515b01e"} Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.298568 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" event={"ID":"131e0189-9ecf-4e0d-825d-f3cae5c83e5b","Type":"ContainerStarted","Data":"72487293aabeb1703391ca9d92ae2bf656ecbf4f877fe719ccd377d190292138"} Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.299832 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" event={"ID":"e8fd5f7b-2a10-4593-9fed-b8bed2db29af","Type":"ContainerStarted","Data":"915f9486fd41905da37cca774c60f94d4aaea59e252993e88398c9c3951ce719"} Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.717980 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.745907 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.748342 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.755760 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.756019 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.756148 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bq7z4" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.756855 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.757167 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.763491 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"combined-ca-bundle" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.906775 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.906840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.906877 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7dv\" (UniqueName: \"kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.906909 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.907115 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " 
pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.907164 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.907252 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:37 crc kubenswrapper[4745]: I1209 11:51:37.907274 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.008866 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.008908 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.008942 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.008958 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.009002 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.009031 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.009055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7dv\" (UniqueName: \"kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.009077 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.010538 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.010775 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.010885 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.011713 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.013218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.021891 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.051139 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7dv\" (UniqueName: \"kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.091978 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.147461 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") " pod="openstack/openstack-galera-0" Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.318872 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b860955-30eb-40e6-bd56-caf6098aed8a","Type":"ContainerStarted","Data":"f576cfeddfbfe71611a5595f13c1d27aeffdd4d65e66d45ef4296a0cf7610284"} Dec 09 11:51:38 crc kubenswrapper[4745]: I1209 11:51:38.428905 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.409564 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.414307 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.422862 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dglcr" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.423064 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.423208 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.423898 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.428167 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647693 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647929 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.647996 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdtsx\" (UniqueName: \"kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.648049 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.648114 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.755002 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.755045 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.755067 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.755110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdtsx\" (UniqueName: \"kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " 
pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.756312 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.756968 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.757116 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.757185 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.757220 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 
11:51:39.758217 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.760541 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.760883 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.761847 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.774102 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:39 crc kubenswrapper[4745]: I1209 11:51:39.787711 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-fdtsx\" (UniqueName: \"kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.875674 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.877046 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.986362 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.986431 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9w4\" (UniqueName: \"kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.986461 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.986876 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config\") pod \"memcached-0\" 
(UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.987336 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.987744 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nwr5f" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.988001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.988020 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:39.990086 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.003744 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.064590 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.080082 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.088170 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.088216 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9w4\" (UniqueName: \"kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.088287 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.088352 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.088382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.116072 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.116953 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.117928 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.300379 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.328899 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9w4\" (UniqueName: \"kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4\") pod \"memcached-0\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.625437 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 09 11:51:40 crc kubenswrapper[4745]: I1209 11:51:40.674905 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:51:40 crc kubenswrapper[4745]: W1209 11:51:40.793566 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a3f188_f451_4895_b500_52a9f7877d00.slice/crio-24972443a6308fdfde1ad75c5910db1a23c0646187375bf6dc56b9be1cd702f9 WatchSource:0}: Error finding container 24972443a6308fdfde1ad75c5910db1a23c0646187375bf6dc56b9be1cd702f9: Status 404 returned error can't find the container with id 24972443a6308fdfde1ad75c5910db1a23c0646187375bf6dc56b9be1cd702f9 Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.017479 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:51:41 crc kubenswrapper[4745]: W1209 11:51:41.032678 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a88ab6_2793_4952_b04c_9041a15e83f9.slice/crio-aaa8b3251637529b75cae7d64799a3625bb55130e930c3ed9d03811da913249e WatchSource:0}: Error finding container aaa8b3251637529b75cae7d64799a3625bb55130e930c3ed9d03811da913249e: Status 404 returned error can't find the container with id aaa8b3251637529b75cae7d64799a3625bb55130e930c3ed9d03811da913249e Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.472748 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.474835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerStarted","Data":"aaa8b3251637529b75cae7d64799a3625bb55130e930c3ed9d03811da913249e"} Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.489598 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerStarted","Data":"24972443a6308fdfde1ad75c5910db1a23c0646187375bf6dc56b9be1cd702f9"} Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.765598 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.767280 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.770990 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cqxkb" Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.779606 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.880885 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwg9k\" (UniqueName: \"kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k\") pod \"kube-state-metrics-0\" (UID: \"d05e5297-a218-42d2-b46a-5c72201d96b4\") " pod="openstack/kube-state-metrics-0" Dec 09 11:51:41 crc kubenswrapper[4745]: I1209 11:51:41.982569 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwg9k\" (UniqueName: \"kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k\") pod \"kube-state-metrics-0\" (UID: \"d05e5297-a218-42d2-b46a-5c72201d96b4\") " pod="openstack/kube-state-metrics-0" Dec 09 11:51:42 crc kubenswrapper[4745]: I1209 11:51:42.011014 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwg9k\" (UniqueName: \"kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k\") pod \"kube-state-metrics-0\" (UID: 
\"d05e5297-a218-42d2-b46a-5c72201d96b4\") " pod="openstack/kube-state-metrics-0" Dec 09 11:51:42 crc kubenswrapper[4745]: I1209 11:51:42.101585 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:51:42 crc kubenswrapper[4745]: I1209 11:51:42.522547 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84ab78e7-7419-4892-92c0-085db552be56","Type":"ContainerStarted","Data":"7040f68d397bde83ed93252b05cd7771f9c0d79e8883478a2cf4d910f8f5a280"} Dec 09 11:51:42 crc kubenswrapper[4745]: I1209 11:51:42.660485 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:51:43 crc kubenswrapper[4745]: I1209 11:51:43.810205 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d05e5297-a218-42d2-b46a-5c72201d96b4","Type":"ContainerStarted","Data":"065868c04fb3fb546e76bc2a69a2712f68a0b508915e6543cf4467e3a1d745df"} Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.885955 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.903185 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.913732 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.916933 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.917091 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.918120 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.918420 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-z5265" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.918579 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.918925 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.919222 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.919812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.920044 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.920189 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdwg\" (UniqueName: \"kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.936836 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.938564 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:45 crc kubenswrapper[4745]: I1209 11:51:45.946123 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051153 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qfq\" (UniqueName: \"kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051368 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051606 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051746 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051828 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.051982 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.052040 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn\") pod \"ovn-controller-dqr9b\" 
(UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.052077 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.052118 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdwg\" (UniqueName: \"kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.052150 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.052149 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.053736 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 
11:51:46.053936 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.059160 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.070629 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.077700 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.084577 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdwg\" (UniqueName: \"kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg\") pod \"ovn-controller-dqr9b\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162724 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162775 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162805 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162844 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162882 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qfq\" (UniqueName: \"kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.162924 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run\") pod \"ovn-controller-ovs-zdn2w\" (UID: 
\"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.163140 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.163698 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.164591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.165782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.166587 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.195019 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j2qfq\" (UniqueName: \"kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq\") pod \"ovn-controller-ovs-zdn2w\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.278992 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b" Dec 09 11:51:46 crc kubenswrapper[4745]: I1209 11:51:46.289590 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.702737 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.705137 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.717027 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7l5bd" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.717386 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.717398 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.717415 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.717631 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.734814 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:51:47 crc 
kubenswrapper[4745]: I1209 11:51:47.828035 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828359 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828413 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828437 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828477 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828504 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.828542 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srvvj\" (UniqueName: \"kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930737 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930770 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930788 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930809 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930832 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930857 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.930875 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srvvj\" (UniqueName: \"kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") 
" pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.931123 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.931417 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.931962 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.938324 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.943544 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.943846 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.947435 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.958468 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:47 crc kubenswrapper[4745]: I1209 11:51:47.962593 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srvvj\" (UniqueName: \"kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj\") pod \"ovsdbserver-nb-0\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.042791 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.686446 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.688579 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.698612 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.699023 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.699122 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.699459 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-glggs" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.714098 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765308 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765416 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765472 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765596 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.765703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrjl\" (UniqueName: \"kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 
crc kubenswrapper[4745]: I1209 11:51:48.888701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888798 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888822 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888841 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888863 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888893 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888925 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.888951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrjl\" (UniqueName: \"kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.889707 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.890486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.890614 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 09 
11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.922179 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.929664 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.939173 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.939613 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:48 crc kubenswrapper[4745]: I1209 11:51:48.951000 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrjl\" (UniqueName: \"kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:49 crc kubenswrapper[4745]: I1209 11:51:49.492877 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " pod="openstack/ovsdbserver-sb-0" Dec 09 11:51:49 crc kubenswrapper[4745]: I1209 11:51:49.649079 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 11:52:07 crc kubenswrapper[4745]: E1209 11:52:07.493180 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13" Dec 09 11:52:07 crc kubenswrapper[4745]: E1209 11:52:07.494791 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,Mo
untPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdtsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(70a88ab6-2793-4952-b04c-9041a15e83f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:07 crc kubenswrapper[4745]: E1209 11:52:07.496948 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" Dec 09 11:52:08 crc kubenswrapper[4745]: E1209 11:52:08.103305 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" Dec 09 11:52:08 crc kubenswrapper[4745]: E1209 11:52:08.573356 4745 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Dec 09 11:52:08 crc kubenswrapper[4745]: E1209 11:52:08.573633 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kts6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(ceba626e-26d1-495f-b88d-fed69e445ddb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:08 crc 
kubenswrapper[4745]: E1209 11:52:08.575320 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" Dec 09 11:52:08 crc kubenswrapper[4745]: E1209 11:52:08.608861 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Dec 09 11:52:08 crc kubenswrapper[4745]: E1209 11:52:08.609205 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grjnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(5b860955-30eb-40e6-bd56-caf6098aed8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:08 crc 
kubenswrapper[4745]: E1209 11:52:08.611414 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" Dec 09 11:52:09 crc kubenswrapper[4745]: E1209 11:52:09.112050 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-server-0" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" Dec 09 11:52:09 crc kubenswrapper[4745]: E1209 11:52:09.112417 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" Dec 09 11:52:12 crc kubenswrapper[4745]: E1209 11:52:12.903529 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13" Dec 09 11:52:12 crc kubenswrapper[4745]: E1209 11:52:12.905375 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vt7dv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(94a3f188-f451-4895-b500-52a9f7877d00): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:12 crc kubenswrapper[4745]: E1209 11:52:12.907702 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="94a3f188-f451-4895-b500-52a9f7877d00" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.159381 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13\\\"\"" pod="openstack/openstack-galera-0" podUID="94a3f188-f451-4895-b500-52a9f7877d00" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.603484 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.604049 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xcxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-596p2_openstack(d3eaea34-0343-402e-a370-0eaf917587d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.605297 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2" podUID="d3eaea34-0343-402e-a370-0eaf917587d0" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.628441 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.628619 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhsvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-c7cbb8f79-c4t4v_openstack(e8fd5f7b-2a10-4593-9fed-b8bed2db29af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.629835 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" podUID="e8fd5f7b-2a10-4593-9fed-b8bed2db29af" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.645299 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.645599 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmgzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-sqvnn_openstack(131e0189-9ecf-4e0d-825d-f3cae5c83e5b): ErrImagePull: rpc error: code = Canceled desc = copying config: context 
canceled" logger="UnhandledError" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.646715 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" podUID="131e0189-9ecf-4e0d-825d-f3cae5c83e5b" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.826066 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.826285 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f7gb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-zplcs_openstack(1bc4dea0-b1c3-4740-9f46-9255fae9b69b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:52:13 crc kubenswrapper[4745]: E1209 11:52:13.827443 4745 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-zplcs" podUID="1bc4dea0-b1c3-4740-9f46-9255fae9b69b" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.067410 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.159550 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:52:14 crc kubenswrapper[4745]: E1209 11:52:14.168429 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" podUID="e8fd5f7b-2a10-4593-9fed-b8bed2db29af" Dec 09 11:52:14 crc kubenswrapper[4745]: E1209 11:52:14.168499 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" podUID="131e0189-9ecf-4e0d-825d-f3cae5c83e5b" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.233445 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.308532 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:52:14 crc kubenswrapper[4745]: W1209 11:52:14.473527 4745 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcea58c8_1e21_4581_bcdb_7b1f88e8b463.slice/crio-a0808d7f62ef2d56de18068d7de9d014400595e957f3924e71ce402e04b08c06 WatchSource:0}: Error finding container a0808d7f62ef2d56de18068d7de9d014400595e957f3924e71ce402e04b08c06: Status 404 returned error can't find the container with id a0808d7f62ef2d56de18068d7de9d014400595e957f3924e71ce402e04b08c06 Dec 09 11:52:14 crc kubenswrapper[4745]: W1209 11:52:14.475083 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38b2a61_5161_4132_be1a_65e25531e73a.slice/crio-b2913bd328e4cb8f4404382e672f30b5f034338f9527b4a46c11c6a720e37e7e WatchSource:0}: Error finding container b2913bd328e4cb8f4404382e672f30b5f034338f9527b4a46c11c6a720e37e7e: Status 404 returned error can't find the container with id b2913bd328e4cb8f4404382e672f30b5f034338f9527b4a46c11c6a720e37e7e Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.593001 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-zplcs" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.602367 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.652627 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config\") pod \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.652796 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xcxk\" (UniqueName: \"kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk\") pod \"d3eaea34-0343-402e-a370-0eaf917587d0\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.652831 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config\") pod \"d3eaea34-0343-402e-a370-0eaf917587d0\" (UID: \"d3eaea34-0343-402e-a370-0eaf917587d0\") " Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.652993 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gb9\" (UniqueName: \"kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9\") pod \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.653117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc\") pod \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\" (UID: \"1bc4dea0-b1c3-4740-9f46-9255fae9b69b\") " Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.653190 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config" (OuterVolumeSpecName: "config") pod "1bc4dea0-b1c3-4740-9f46-9255fae9b69b" (UID: "1bc4dea0-b1c3-4740-9f46-9255fae9b69b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.653653 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.653691 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config" (OuterVolumeSpecName: "config") pod "d3eaea34-0343-402e-a370-0eaf917587d0" (UID: "d3eaea34-0343-402e-a370-0eaf917587d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.654245 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bc4dea0-b1c3-4740-9f46-9255fae9b69b" (UID: "1bc4dea0-b1c3-4740-9f46-9255fae9b69b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.659922 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk" (OuterVolumeSpecName: "kube-api-access-4xcxk") pod "d3eaea34-0343-402e-a370-0eaf917587d0" (UID: "d3eaea34-0343-402e-a370-0eaf917587d0"). InnerVolumeSpecName "kube-api-access-4xcxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.659998 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9" (OuterVolumeSpecName: "kube-api-access-f7gb9") pod "1bc4dea0-b1c3-4740-9f46-9255fae9b69b" (UID: "1bc4dea0-b1c3-4740-9f46-9255fae9b69b"). InnerVolumeSpecName "kube-api-access-f7gb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.755737 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7gb9\" (UniqueName: \"kubernetes.io/projected/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-kube-api-access-f7gb9\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.755786 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc4dea0-b1c3-4740-9f46-9255fae9b69b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.755800 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xcxk\" (UniqueName: \"kubernetes.io/projected/d3eaea34-0343-402e-a370-0eaf917587d0-kube-api-access-4xcxk\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:14 crc kubenswrapper[4745]: I1209 11:52:14.755814 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3eaea34-0343-402e-a370-0eaf917587d0-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.186810 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerStarted","Data":"9c5f9496854fd26bc519c9bec4075b075f278b18226b402ed3b9fccc9d940e12"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.189600 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerStarted","Data":"13acead9b2f70206b0d4d9acc4b65b11ce773376a33a579d14ee567690763687"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.191073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2" event={"ID":"d3eaea34-0343-402e-a370-0eaf917587d0","Type":"ContainerDied","Data":"4cfd28671f439e47ae8212f8c4c3d1426f619f34be5525cc1ac1cd23d3430b86"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.191098 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-596p2" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.192551 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b" event={"ID":"dcea58c8-1e21-4581-bcdb-7b1f88e8b463","Type":"ContainerStarted","Data":"a0808d7f62ef2d56de18068d7de9d014400595e957f3924e71ce402e04b08c06"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.194112 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerStarted","Data":"b2913bd328e4cb8f4404382e672f30b5f034338f9527b4a46c11c6a720e37e7e"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.196441 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-zplcs" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.196426 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-zplcs" event={"ID":"1bc4dea0-b1c3-4740-9f46-9255fae9b69b","Type":"ContainerDied","Data":"44e0c8d152f788909b95d0ac2bca5a66800ad098a75e47f94e27134aaa526437"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.199277 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84ab78e7-7419-4892-92c0-085db552be56","Type":"ContainerStarted","Data":"165fc14fd97a535329fc700bc5ee13ff68c82e069d90770b464ef7ed62a40419"} Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.200332 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.229568 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=4.04827539 podStartE2EDuration="36.229546392s" podCreationTimestamp="2025-12-09 11:51:39 +0000 UTC" firstStartedPulling="2025-12-09 11:51:41.504778 +0000 UTC m=+1188.329979524" lastFinishedPulling="2025-12-09 11:52:13.686049002 +0000 UTC m=+1220.511250526" observedRunningTime="2025-12-09 11:52:15.221478775 +0000 UTC m=+1222.046680299" watchObservedRunningTime="2025-12-09 11:52:15.229546392 +0000 UTC m=+1222.054747916" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.271036 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"] Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.295666 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-zplcs"] Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.320328 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"] Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 
11:52:15.343933 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-596p2"] Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.569323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc4dea0-b1c3-4740-9f46-9255fae9b69b" path="/var/lib/kubelet/pods/1bc4dea0-b1c3-4740-9f46-9255fae9b69b/volumes" Dec 09 11:52:15 crc kubenswrapper[4745]: I1209 11:52:15.570354 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eaea34-0343-402e-a370-0eaf917587d0" path="/var/lib/kubelet/pods/d3eaea34-0343-402e-a370-0eaf917587d0/volumes" Dec 09 11:52:16 crc kubenswrapper[4745]: I1209 11:52:16.216347 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d05e5297-a218-42d2-b46a-5c72201d96b4","Type":"ContainerStarted","Data":"523941c45b4eb75f71913c34b201f37a3dbf8c9e8afb7b7136a440e67eeffc4a"} Dec 09 11:52:16 crc kubenswrapper[4745]: I1209 11:52:16.216422 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 11:52:16 crc kubenswrapper[4745]: I1209 11:52:16.245731 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.205057782 podStartE2EDuration="35.245702674s" podCreationTimestamp="2025-12-09 11:51:41 +0000 UTC" firstStartedPulling="2025-12-09 11:51:42.706147518 +0000 UTC m=+1189.531349042" lastFinishedPulling="2025-12-09 11:52:15.74679241 +0000 UTC m=+1222.571993934" observedRunningTime="2025-12-09 11:52:16.236533038 +0000 UTC m=+1223.061734562" watchObservedRunningTime="2025-12-09 11:52:16.245702674 +0000 UTC m=+1223.070904198" Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.246354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerStarted","Data":"40aa5562bd3f9aae5ac913d05ca874ce263e3c959881973ff50135ec376a86f0"} Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.248666 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b" event={"ID":"dcea58c8-1e21-4581-bcdb-7b1f88e8b463","Type":"ContainerStarted","Data":"fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949"} Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.249241 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dqr9b" Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.251610 4745 generic.go:334] "Generic (PLEG): container finished" podID="c38b2a61-5161-4132-be1a-65e25531e73a" containerID="a71d23fc91157e8ffa3a582e3c253443fba5f09c072196181f1042c72a332661" exitCode=0 Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.251672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerDied","Data":"a71d23fc91157e8ffa3a582e3c253443fba5f09c072196181f1042c72a332661"} Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.254636 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerStarted","Data":"bff23da6e6763af0c2998317017091c2a8ad2169bcc47bd632c82d8992462e86"} Dec 09 11:52:19 crc kubenswrapper[4745]: I1209 11:52:19.275572 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dqr9b" podStartSLOduration=30.744790797 podStartE2EDuration="34.275550878s" podCreationTimestamp="2025-12-09 11:51:45 +0000 UTC" firstStartedPulling="2025-12-09 11:52:14.478266933 +0000 UTC m=+1221.303468457" lastFinishedPulling="2025-12-09 11:52:18.009026984 +0000 UTC m=+1224.834228538" observedRunningTime="2025-12-09 
11:52:19.268124878 +0000 UTC m=+1226.093326402" watchObservedRunningTime="2025-12-09 11:52:19.275550878 +0000 UTC m=+1226.100752402" Dec 09 11:52:20 crc kubenswrapper[4745]: I1209 11:52:20.267155 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerStarted","Data":"d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390"} Dec 09 11:52:20 crc kubenswrapper[4745]: I1209 11:52:20.268131 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerStarted","Data":"b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253"} Dec 09 11:52:20 crc kubenswrapper[4745]: I1209 11:52:20.269057 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:52:20 crc kubenswrapper[4745]: I1209 11:52:20.295917 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zdn2w" podStartSLOduration=31.791193353 podStartE2EDuration="35.295895243s" podCreationTimestamp="2025-12-09 11:51:45 +0000 UTC" firstStartedPulling="2025-12-09 11:52:14.479012703 +0000 UTC m=+1221.304214227" lastFinishedPulling="2025-12-09 11:52:17.983714593 +0000 UTC m=+1224.808916117" observedRunningTime="2025-12-09 11:52:20.292373058 +0000 UTC m=+1227.117574592" watchObservedRunningTime="2025-12-09 11:52:20.295895243 +0000 UTC m=+1227.121096767" Dec 09 11:52:20 crc kubenswrapper[4745]: I1209 11:52:20.627603 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 09 11:52:21 crc kubenswrapper[4745]: I1209 11:52:21.278441 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.115526 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.160959 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"] Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.218042 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.221487 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.237487 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.292353 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerStarted","Data":"82e76a7787b8495afefaced3bf46a59645fcc2228c694a5e7e6a738e92d9f044"} Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.302238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerStarted","Data":"f5ff7ad1034757488e9fa6ef3e4534c7b33ad6d4824c61d216c9946583e12257"} Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.306836 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.307096 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgp7n\" (UniqueName: 
\"kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.307347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.306966 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerStarted","Data":"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3"} Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.329171 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=29.029846825 podStartE2EDuration="36.329147611s" podCreationTimestamp="2025-12-09 11:51:46 +0000 UTC" firstStartedPulling="2025-12-09 11:52:14.345297232 +0000 UTC m=+1221.170498756" lastFinishedPulling="2025-12-09 11:52:21.644598018 +0000 UTC m=+1228.469799542" observedRunningTime="2025-12-09 11:52:22.318217686 +0000 UTC m=+1229.143419210" watchObservedRunningTime="2025-12-09 11:52:22.329147611 +0000 UTC m=+1229.154349135" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.409995 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.411623 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.411661 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgp7n\" (UniqueName: \"kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.413257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.413808 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.423551 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.090808997 podStartE2EDuration="35.423537202s" podCreationTimestamp="2025-12-09 11:51:47 +0000 UTC" firstStartedPulling="2025-12-09 11:52:14.347232425 +0000 UTC m=+1221.172433949" lastFinishedPulling="2025-12-09 11:52:21.67996063 +0000 UTC m=+1228.505162154" observedRunningTime="2025-12-09 11:52:22.414777876 +0000 UTC m=+1229.239979410" 
watchObservedRunningTime="2025-12-09 11:52:22.423537202 +0000 UTC m=+1229.248738726" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.433010 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgp7n\" (UniqueName: \"kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n\") pod \"dnsmasq-dns-7f9f9f545f-sp6l7\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.555870 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.649989 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.718680 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.735777 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.834705 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhsvt\" (UniqueName: \"kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt\") pod \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.836017 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config\") pod \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.836108 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc\") pod \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\" (UID: \"e8fd5f7b-2a10-4593-9fed-b8bed2db29af\") " Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.836873 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8fd5f7b-2a10-4593-9fed-b8bed2db29af" (UID: "e8fd5f7b-2a10-4593-9fed-b8bed2db29af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.837117 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config" (OuterVolumeSpecName: "config") pod "e8fd5f7b-2a10-4593-9fed-b8bed2db29af" (UID: "e8fd5f7b-2a10-4593-9fed-b8bed2db29af"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.837906 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.837930 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.845363 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt" (OuterVolumeSpecName: "kube-api-access-vhsvt") pod "e8fd5f7b-2a10-4593-9fed-b8bed2db29af" (UID: "e8fd5f7b-2a10-4593-9fed-b8bed2db29af"). InnerVolumeSpecName "kube-api-access-vhsvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:22 crc kubenswrapper[4745]: I1209 11:52:22.940934 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhsvt\" (UniqueName: \"kubernetes.io/projected/e8fd5f7b-2a10-4593-9fed-b8bed2db29af-kube-api-access-vhsvt\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.045225 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.061953 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:23 crc kubenswrapper[4745]: W1209 11:52:23.069710 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0ec42f1_88e6_4104_9272_eaf4f4ae7326.slice/crio-b0b6fdbe993f4b08578466ab8dbdb2d8f8d3b883317ab89d3a45b355bbb28d16 WatchSource:0}: Error finding container 
b0b6fdbe993f4b08578466ab8dbdb2d8f8d3b883317ab89d3a45b355bbb28d16: Status 404 returned error can't find the container with id b0b6fdbe993f4b08578466ab8dbdb2d8f8d3b883317ab89d3a45b355bbb28d16 Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.231893 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.241390 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.243643 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s6r47" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.243666 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.244658 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.245159 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.259290 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.317548 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" event={"ID":"d0ec42f1-88e6-4104-9272-eaf4f4ae7326","Type":"ContainerStarted","Data":"b0b6fdbe993f4b08578466ab8dbdb2d8f8d3b883317ab89d3a45b355bbb28d16"} Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.318422 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" event={"ID":"e8fd5f7b-2a10-4593-9fed-b8bed2db29af","Type":"ContainerDied","Data":"915f9486fd41905da37cca774c60f94d4aaea59e252993e88398c9c3951ce719"} Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 
11:52:23.318528 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-c4t4v" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.319872 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.350781 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.350849 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.350991 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8z72\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.351019 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.351056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.379533 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.396912 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.403167 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-c4t4v"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.453458 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.453545 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.453583 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8z72\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.453612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" 
(UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.453654 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.454352 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.454385 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.454445 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. No retries permitted until 2025-12-09 11:52:23.954420671 +0000 UTC m=+1230.779622195 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.455240 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.455381 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.455797 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.483967 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8z72\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.487809 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " 
pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.618278 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fd5f7b-2a10-4593-9fed-b8bed2db29af" path="/var/lib/kubelet/pods/e8fd5f7b-2a10-4593-9fed-b8bed2db29af/volumes" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.739964 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n77pg"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.743263 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.756075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.756106 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.756075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.757809 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n77pg"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761117 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts\") pod \"swift-ring-rebalance-n77pg\" (UID: 
\"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761202 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761246 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761299 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761324 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.761355 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh64\" (UniqueName: \"kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64\") pod 
\"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.794793 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865681 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865719 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865753 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865791 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh64\" (UniqueName: 
\"kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865845 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.865874 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.867990 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.868684 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.868723 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.871784 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.876676 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.887465 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.890260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh64\" (UniqueName: \"kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.901135 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.901897 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.902100 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.924177 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle\") pod \"swift-ring-rebalance-n77pg\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.967579 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.968093 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.968107 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: E1209 11:52:23.968147 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. No retries permitted until 2025-12-09 11:52:24.968132623 +0000 UTC m=+1231.793334147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.975101 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.976443 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.981005 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 09 11:52:23 crc kubenswrapper[4745]: I1209 11:52:23.991047 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.044889 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.070955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.071025 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.071096 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6f9\" (UniqueName: \"kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.071128 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.085089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.173327 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.183882 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6f9\" (UniqueName: \"kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184170 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184250 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " 
pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184357 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrb5g\" (UniqueName: \"kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184440 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184613 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.184872 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.185030 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " 
pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.185104 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.186453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.187140 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.187757 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.228034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6f9\" (UniqueName: \"kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9\") pod \"dnsmasq-dns-86bc67f99c-gtwzc\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc 
kubenswrapper[4745]: I1209 11:52:24.290950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.291369 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.291462 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.291595 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.291670 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.291766 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrb5g\" (UniqueName: \"kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.292452 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.299003 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.305068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.314809 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.322210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.328725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrb5g\" (UniqueName: \"kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g\") pod \"ovn-controller-metrics-dw6lx\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.347246 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.348714 4745 generic.go:334] "Generic (PLEG): container finished" podID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerID="54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a" exitCode=0 Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.352375 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" event={"ID":"d0ec42f1-88e6-4104-9272-eaf4f4ae7326","Type":"ContainerDied","Data":"54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a"} Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.431739 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.455547 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.461430 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.501601 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmgzz\" (UniqueName: \"kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz\") pod \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.501653 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config\") pod \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.501679 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc\") pod \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\" (UID: \"131e0189-9ecf-4e0d-825d-f3cae5c83e5b\") " Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.504890 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "131e0189-9ecf-4e0d-825d-f3cae5c83e5b" (UID: "131e0189-9ecf-4e0d-825d-f3cae5c83e5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.505299 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config" (OuterVolumeSpecName: "config") pod "131e0189-9ecf-4e0d-825d-f3cae5c83e5b" (UID: "131e0189-9ecf-4e0d-825d-f3cae5c83e5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.505386 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.524535 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz" (OuterVolumeSpecName: "kube-api-access-tmgzz") pod "131e0189-9ecf-4e0d-825d-f3cae5c83e5b" (UID: "131e0189-9ecf-4e0d-825d-f3cae5c83e5b"). InnerVolumeSpecName "kube-api-access-tmgzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.524696 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.532688 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.534601 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.537842 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.563769 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604670 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604744 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604816 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mzjp\" (UniqueName: \"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " 
pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604868 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604946 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604962 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmgzz\" (UniqueName: \"kubernetes.io/projected/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-kube-api-access-tmgzz\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.604976 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/131e0189-9ecf-4e0d-825d-f3cae5c83e5b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.706792 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.706855 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " 
pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.706911 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.706956 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mzjp\" (UniqueName: \"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.706993 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.708467 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.709229 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: 
I1209 11:52:24.710055 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.719901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.741591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mzjp\" (UniqueName: \"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp\") pod \"dnsmasq-dns-67fdf7998c-dn8js\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.817945 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.819606 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.829817 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.830115 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hztdj" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.830273 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.830423 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.846360 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.858073 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n77pg"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.882034 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.909677 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910165 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rpl\" (UniqueName: \"kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910205 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910352 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910385 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910437 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910475 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:24 crc kubenswrapper[4745]: I1209 11:52:24.910498 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.012493 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.012582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.012617 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc 
kubenswrapper[4745]: I1209 11:52:25.012676 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: E1209 11:52:25.012993 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 11:52:25 crc kubenswrapper[4745]: E1209 11:52:25.013029 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:25 crc kubenswrapper[4745]: E1209 11:52:25.013102 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. No retries permitted until 2025-12-09 11:52:27.013071199 +0000 UTC m=+1233.838272903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.017743 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.017864 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.017897 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.017983 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.018026 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rpl\" (UniqueName: \"kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " 
pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.018062 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.020268 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.029618 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.029673 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.045666 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rpl\" (UniqueName: \"kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.060386 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.076615 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.135562 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.377767 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.377926 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-sqvnn" event={"ID":"131e0189-9ecf-4e0d-825d-f3cae5c83e5b","Type":"ContainerDied","Data":"72487293aabeb1703391ca9d92ae2bf656ecbf4f877fe719ccd377d190292138"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.389998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" event={"ID":"d0ec42f1-88e6-4104-9272-eaf4f4ae7326","Type":"ContainerStarted","Data":"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.390068 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="dnsmasq-dns" containerID="cri-o://cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0" gracePeriod=10 Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.390103 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.402690 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-dw6lx" event={"ID":"52964bb1-2d93-4df7-afbc-95f1eb10b8fc","Type":"ContainerStarted","Data":"b7740d4a42e4f85b15f623e06a6befc52cc38d311b5742d6e8f3cac547740344"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.405781 4745 generic.go:334] "Generic (PLEG): container finished" podID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerID="b24b8ae89fd998bfc3e123d684130cfa9d42777fb053fedc95ae98ac5a4b972f" exitCode=0 Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.405839 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" event={"ID":"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4","Type":"ContainerDied","Data":"b24b8ae89fd998bfc3e123d684130cfa9d42777fb053fedc95ae98ac5a4b972f"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.405857 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" event={"ID":"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4","Type":"ContainerStarted","Data":"f5e518b67792d0064fe6f25a880e6b534e3daf4e4d8f197a79794a62ec7a8c50"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.410317 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n77pg" event={"ID":"c20c4e93-62b6-4c52-9783-eca25d00c91f","Type":"ContainerStarted","Data":"b12279cf395ed150c34134e42567108cd2c8da41ecc8fe7f8d1a0016d4498897"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.417651 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerStarted","Data":"4b5baf1df440151578f138c2907e5ac84a040ebc7d009362d8e110c6311812ee"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.430906 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"5b860955-30eb-40e6-bd56-caf6098aed8a","Type":"ContainerStarted","Data":"3183cfb66289decc58035a93ed3c6db16f62f8769a1980e9dea7b5da9edf7ad3"} Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.442475 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" podStartSLOduration=3.048298798 podStartE2EDuration="3.442434061s" podCreationTimestamp="2025-12-09 11:52:22 +0000 UTC" firstStartedPulling="2025-12-09 11:52:23.073075052 +0000 UTC m=+1229.898276576" lastFinishedPulling="2025-12-09 11:52:23.467210315 +0000 UTC m=+1230.292411839" observedRunningTime="2025-12-09 11:52:25.41790723 +0000 UTC m=+1232.243108754" watchObservedRunningTime="2025-12-09 11:52:25.442434061 +0000 UTC m=+1232.267635585" Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.489114 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.630369 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.693429 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"] Dec 09 11:52:25 crc kubenswrapper[4745]: I1209 11:52:25.714972 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-sqvnn"] Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.054563 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.159788 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config\") pod \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.160009 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgp7n\" (UniqueName: \"kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n\") pod \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.160063 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc\") pod \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\" (UID: \"d0ec42f1-88e6-4104-9272-eaf4f4ae7326\") " Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.166927 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n" (OuterVolumeSpecName: "kube-api-access-cgp7n") pod "d0ec42f1-88e6-4104-9272-eaf4f4ae7326" (UID: "d0ec42f1-88e6-4104-9272-eaf4f4ae7326"). InnerVolumeSpecName "kube-api-access-cgp7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.205648 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0ec42f1-88e6-4104-9272-eaf4f4ae7326" (UID: "d0ec42f1-88e6-4104-9272-eaf4f4ae7326"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.208900 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config" (OuterVolumeSpecName: "config") pod "d0ec42f1-88e6-4104-9272-eaf4f4ae7326" (UID: "d0ec42f1-88e6-4104-9272-eaf4f4ae7326"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.262680 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.262716 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgp7n\" (UniqueName: \"kubernetes.io/projected/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-kube-api-access-cgp7n\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.262730 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ec42f1-88e6-4104-9272-eaf4f4ae7326-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.453344 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerStarted","Data":"367e9fb43e0608bca3ac04753c11274f0fa57e93d77bd613914c60ca5b8536b0"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.455577 4745 generic.go:334] "Generic (PLEG): container finished" podID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerID="5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54" exitCode=0 Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.455645 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" 
event={"ID":"12a5c3fa-478a-4716-b597-6b6b54eb274a","Type":"ContainerDied","Data":"5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.455672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" event={"ID":"12a5c3fa-478a-4716-b597-6b6b54eb274a","Type":"ContainerStarted","Data":"8f61721a5bd89a50ba16b8000257f20fb6efe1c08c924d446f41ff935e6a1844"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.459100 4745 generic.go:334] "Generic (PLEG): container finished" podID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerID="cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0" exitCode=0 Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.459164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" event={"ID":"d0ec42f1-88e6-4104-9272-eaf4f4ae7326","Type":"ContainerDied","Data":"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.459183 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" event={"ID":"d0ec42f1-88e6-4104-9272-eaf4f4ae7326","Type":"ContainerDied","Data":"b0b6fdbe993f4b08578466ab8dbdb2d8f8d3b883317ab89d3a45b355bbb28d16"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.459202 4745 scope.go:117] "RemoveContainer" containerID="cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.459303 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9f9f545f-sp6l7" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.465174 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dw6lx" event={"ID":"52964bb1-2d93-4df7-afbc-95f1eb10b8fc","Type":"ContainerStarted","Data":"6c0e76c55419e28835b5addc2baf871683786704476377eadb9a576f9a3ff72e"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.467501 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" event={"ID":"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4","Type":"ContainerStarted","Data":"d388df09a36cc76fa0d853b7aac12a653aedd3f11c37b8df77d835920e088ad4"} Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.507196 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" podStartSLOduration=3.507164421 podStartE2EDuration="3.507164421s" podCreationTimestamp="2025-12-09 11:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:26.501683643 +0000 UTC m=+1233.326885157" watchObservedRunningTime="2025-12-09 11:52:26.507164421 +0000 UTC m=+1233.332365945" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.532939 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-dw6lx" podStartSLOduration=3.532903744 podStartE2EDuration="3.532903744s" podCreationTimestamp="2025-12-09 11:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:26.519431791 +0000 UTC m=+1233.344633325" watchObservedRunningTime="2025-12-09 11:52:26.532903744 +0000 UTC m=+1233.358105268" Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.545989 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:26 crc kubenswrapper[4745]: I1209 11:52:26.557361 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9f9f545f-sp6l7"] Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.079813 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:27 crc kubenswrapper[4745]: E1209 11:52:27.080067 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 11:52:27 crc kubenswrapper[4745]: E1209 11:52:27.080104 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:27 crc kubenswrapper[4745]: E1209 11:52:27.080174 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. No retries permitted until 2025-12-09 11:52:31.08015614 +0000 UTC m=+1237.905357664 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.478680 4745 generic.go:334] "Generic (PLEG): container finished" podID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerID="f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3" exitCode=0 Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.479422 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerDied","Data":"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3"} Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.479827 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.567214 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131e0189-9ecf-4e0d-825d-f3cae5c83e5b" path="/var/lib/kubelet/pods/131e0189-9ecf-4e0d-825d-f3cae5c83e5b/volumes" Dec 09 11:52:27 crc kubenswrapper[4745]: I1209 11:52:27.567629 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" path="/var/lib/kubelet/pods/d0ec42f1-88e6-4104-9272-eaf4f4ae7326/volumes" Dec 09 11:52:31 crc kubenswrapper[4745]: I1209 11:52:31.169536 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:31 crc kubenswrapper[4745]: E1209 11:52:31.169821 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Dec 09 11:52:31 crc kubenswrapper[4745]: E1209 11:52:31.170048 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:31 crc kubenswrapper[4745]: E1209 11:52:31.170120 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. No retries permitted until 2025-12-09 11:52:39.170101168 +0000 UTC m=+1245.995302692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:32 crc kubenswrapper[4745]: I1209 11:52:32.708725 4745 scope.go:117] "RemoveContainer" containerID="54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a" Dec 09 11:52:32 crc kubenswrapper[4745]: I1209 11:52:32.766693 4745 scope.go:117] "RemoveContainer" containerID="cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0" Dec 09 11:52:32 crc kubenswrapper[4745]: E1209 11:52:32.768989 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0\": container with ID starting with cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0 not found: ID does not exist" containerID="cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0" Dec 09 11:52:32 crc kubenswrapper[4745]: I1209 11:52:32.769055 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0"} err="failed to get container status 
\"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0\": rpc error: code = NotFound desc = could not find container \"cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0\": container with ID starting with cc38660b4a24d8662d042062a769f27b262c81ec25304c547adf95ffb98e25b0 not found: ID does not exist" Dec 09 11:52:32 crc kubenswrapper[4745]: I1209 11:52:32.769080 4745 scope.go:117] "RemoveContainer" containerID="54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a" Dec 09 11:52:32 crc kubenswrapper[4745]: E1209 11:52:32.770163 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a\": container with ID starting with 54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a not found: ID does not exist" containerID="54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a" Dec 09 11:52:32 crc kubenswrapper[4745]: I1209 11:52:32.770182 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a"} err="failed to get container status \"54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a\": rpc error: code = NotFound desc = could not find container \"54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a\": container with ID starting with 54e86d21eea826709857d1503535c9b8ce243bbae076e53c04a450dc3d968a4a not found: ID does not exist" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.549912 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" event={"ID":"12a5c3fa-478a-4716-b597-6b6b54eb274a","Type":"ContainerStarted","Data":"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.550525 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.552689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerStarted","Data":"c14f0fa5e7076677ce1b3c78038a8595a6de35d44d5b0b1d076e1b999699eed6"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.562827 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n77pg" event={"ID":"c20c4e93-62b6-4c52-9783-eca25d00c91f","Type":"ContainerStarted","Data":"3bd3624dff82e436388a8bce15114930fc110be43138332c217c6c5dc788a655"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.562875 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerStarted","Data":"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.562888 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerStarted","Data":"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.562899 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerStarted","Data":"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"} Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.562912 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.575031 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" podStartSLOduration=9.575009224 podStartE2EDuration="9.575009224s" 
podCreationTimestamp="2025-12-09 11:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:33.570960545 +0000 UTC m=+1240.396162089" watchObservedRunningTime="2025-12-09 11:52:33.575009224 +0000 UTC m=+1240.400210748" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.605695 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-n77pg" podStartSLOduration=2.816826513 podStartE2EDuration="10.60567097s" podCreationTimestamp="2025-12-09 11:52:23 +0000 UTC" firstStartedPulling="2025-12-09 11:52:24.982849006 +0000 UTC m=+1231.808050530" lastFinishedPulling="2025-12-09 11:52:32.771693463 +0000 UTC m=+1239.596894987" observedRunningTime="2025-12-09 11:52:33.598074065 +0000 UTC m=+1240.423275609" watchObservedRunningTime="2025-12-09 11:52:33.60567097 +0000 UTC m=+1240.430872504" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.627471 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=15.093989269 podStartE2EDuration="55.627436226s" podCreationTimestamp="2025-12-09 11:51:38 +0000 UTC" firstStartedPulling="2025-12-09 11:51:41.048779141 +0000 UTC m=+1187.873980665" lastFinishedPulling="2025-12-09 11:52:21.582226088 +0000 UTC m=+1228.407427622" observedRunningTime="2025-12-09 11:52:33.62055061 +0000 UTC m=+1240.445752144" watchObservedRunningTime="2025-12-09 11:52:33.627436226 +0000 UTC m=+1240.452637750" Dec 09 11:52:33 crc kubenswrapper[4745]: I1209 11:52:33.646055 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.5805019160000002 podStartE2EDuration="9.646037157s" podCreationTimestamp="2025-12-09 11:52:24 +0000 UTC" firstStartedPulling="2025-12-09 11:52:25.702140484 +0000 UTC m=+1232.527342008" lastFinishedPulling="2025-12-09 11:52:32.767675725 +0000 
UTC m=+1239.592877249" observedRunningTime="2025-12-09 11:52:33.639333866 +0000 UTC m=+1240.464535390" watchObservedRunningTime="2025-12-09 11:52:33.646037157 +0000 UTC m=+1240.471238681" Dec 09 11:52:34 crc kubenswrapper[4745]: I1209 11:52:34.457964 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:38 crc kubenswrapper[4745]: I1209 11:52:38.619037 4745 generic.go:334] "Generic (PLEG): container finished" podID="94a3f188-f451-4895-b500-52a9f7877d00" containerID="c14f0fa5e7076677ce1b3c78038a8595a6de35d44d5b0b1d076e1b999699eed6" exitCode=0 Dec 09 11:52:38 crc kubenswrapper[4745]: I1209 11:52:38.619130 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerDied","Data":"c14f0fa5e7076677ce1b3c78038a8595a6de35d44d5b0b1d076e1b999699eed6"} Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.237654 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:39 crc kubenswrapper[4745]: E1209 11:52:39.237885 4745 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 11:52:39 crc kubenswrapper[4745]: E1209 11:52:39.238116 4745 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 11:52:39 crc kubenswrapper[4745]: E1209 11:52:39.238177 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift podName:71ebc86b-4ef3-4d3f-911f-93036c9ce19b nodeName:}" failed. 
No retries permitted until 2025-12-09 11:52:55.238160733 +0000 UTC m=+1262.063362257 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift") pod "swift-storage-0" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b") : configmap "swift-ring-files" not found Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.628169 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerStarted","Data":"38929371712d3d60dc685cf4a33c96454c4b248c83eff828bff29196ac2964c6"} Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.661980 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371973.192814 podStartE2EDuration="1m3.661962555s" podCreationTimestamp="2025-12-09 11:51:36 +0000 UTC" firstStartedPulling="2025-12-09 11:51:40.824229165 +0000 UTC m=+1187.649430689" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:39.659994612 +0000 UTC m=+1246.485196156" watchObservedRunningTime="2025-12-09 11:52:39.661962555 +0000 UTC m=+1246.487164079" Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.884711 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.966629 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:39 crc kubenswrapper[4745]: I1209 11:52:39.967209 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="dnsmasq-dns" containerID="cri-o://d388df09a36cc76fa0d853b7aac12a653aedd3f11c37b8df77d835920e088ad4" gracePeriod=10 Dec 09 11:52:40 crc 
kubenswrapper[4745]: I1209 11:52:40.472929 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 09 11:52:40 crc kubenswrapper[4745]: I1209 11:52:40.474208 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 09 11:52:40 crc kubenswrapper[4745]: I1209 11:52:40.638168 4745 generic.go:334] "Generic (PLEG): container finished" podID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerID="d388df09a36cc76fa0d853b7aac12a653aedd3f11c37b8df77d835920e088ad4" exitCode=0 Dec 09 11:52:40 crc kubenswrapper[4745]: I1209 11:52:40.638223 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" event={"ID":"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4","Type":"ContainerDied","Data":"d388df09a36cc76fa0d853b7aac12a653aedd3f11c37b8df77d835920e088ad4"} Dec 09 11:52:40 crc kubenswrapper[4745]: I1209 11:52:40.721243 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.025916 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.366900 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.483402 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb\") pod \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.483457 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config\") pod \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.483610 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc\") pod \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.483650 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6f9\" (UniqueName: \"kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9\") pod \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\" (UID: \"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4\") " Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.490637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9" (OuterVolumeSpecName: "kube-api-access-qs6f9") pod "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" (UID: "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4"). InnerVolumeSpecName "kube-api-access-qs6f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.525944 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" (UID: "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.536140 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config" (OuterVolumeSpecName: "config") pod "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" (UID: "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.537957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" (UID: "3b719a4c-76c9-4ac2-9b63-265e8d2ddea4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.588708 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.588739 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.588749 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.588759 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6f9\" (UniqueName: \"kubernetes.io/projected/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4-kube-api-access-qs6f9\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.648395 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" event={"ID":"3b719a4c-76c9-4ac2-9b63-265e8d2ddea4","Type":"ContainerDied","Data":"f5e518b67792d0064fe6f25a880e6b534e3daf4e4d8f197a79794a62ec7a8c50"} Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.648432 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bc67f99c-gtwzc" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.648471 4745 scope.go:117] "RemoveContainer" containerID="d388df09a36cc76fa0d853b7aac12a653aedd3f11c37b8df77d835920e088ad4" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.650435 4745 generic.go:334] "Generic (PLEG): container finished" podID="c20c4e93-62b6-4c52-9783-eca25d00c91f" containerID="3bd3624dff82e436388a8bce15114930fc110be43138332c217c6c5dc788a655" exitCode=0 Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.650609 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n77pg" event={"ID":"c20c4e93-62b6-4c52-9783-eca25d00c91f","Type":"ContainerDied","Data":"3bd3624dff82e436388a8bce15114930fc110be43138332c217c6c5dc788a655"} Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.689454 4745 scope.go:117] "RemoveContainer" containerID="b24b8ae89fd998bfc3e123d684130cfa9d42777fb053fedc95ae98ac5a4b972f" Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.700032 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:41 crc kubenswrapper[4745]: I1209 11:52:41.707469 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bc67f99c-gtwzc"] Dec 09 11:52:42 crc kubenswrapper[4745]: I1209 11:52:42.964552 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030108 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxh64\" (UniqueName: \"kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030422 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030460 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030523 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030540 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030580 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.030604 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts\") pod \"c20c4e93-62b6-4c52-9783-eca25d00c91f\" (UID: \"c20c4e93-62b6-4c52-9783-eca25d00c91f\") " Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.031169 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.031852 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.040961 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64" (OuterVolumeSpecName: "kube-api-access-cxh64") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "kube-api-access-cxh64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.041580 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.057422 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.061891 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.062228 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts" (OuterVolumeSpecName: "scripts") pod "c20c4e93-62b6-4c52-9783-eca25d00c91f" (UID: "c20c4e93-62b6-4c52-9783-eca25d00c91f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133169 4745 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133206 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133216 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxh64\" (UniqueName: \"kubernetes.io/projected/c20c4e93-62b6-4c52-9783-eca25d00c91f-kube-api-access-cxh64\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133227 4745 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20c4e93-62b6-4c52-9783-eca25d00c91f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133236 4745 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20c4e93-62b6-4c52-9783-eca25d00c91f-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133246 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.133255 4745 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20c4e93-62b6-4c52-9783-eca25d00c91f-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.571255 4745 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" path="/var/lib/kubelet/pods/3b719a4c-76c9-4ac2-9b63-265e8d2ddea4/volumes" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.673942 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n77pg" event={"ID":"c20c4e93-62b6-4c52-9783-eca25d00c91f","Type":"ContainerDied","Data":"b12279cf395ed150c34134e42567108cd2c8da41ecc8fe7f8d1a0016d4498897"} Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.673994 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b12279cf395ed150c34134e42567108cd2c8da41ecc8fe7f8d1a0016d4498897" Dec 09 11:52:43 crc kubenswrapper[4745]: I1209 11:52:43.674083 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n77pg" Dec 09 11:52:45 crc kubenswrapper[4745]: I1209 11:52:45.140953 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 09 11:52:48 crc kubenswrapper[4745]: I1209 11:52:48.430082 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 09 11:52:48 crc kubenswrapper[4745]: I1209 11:52:48.430417 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 09 11:52:48 crc kubenswrapper[4745]: I1209 11:52:48.521965 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 09 11:52:48 crc kubenswrapper[4745]: I1209 11:52:48.786677 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.703664 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-r4f68"] Dec 09 11:52:49 crc kubenswrapper[4745]: E1209 11:52:49.704216 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c20c4e93-62b6-4c52-9783-eca25d00c91f" containerName="swift-ring-rebalance" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704237 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20c4e93-62b6-4c52-9783-eca25d00c91f" containerName="swift-ring-rebalance" Dec 09 11:52:49 crc kubenswrapper[4745]: E1209 11:52:49.704257 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="init" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704267 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="init" Dec 09 11:52:49 crc kubenswrapper[4745]: E1209 11:52:49.704289 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704297 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: E1209 11:52:49.704321 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704329 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: E1209 11:52:49.704410 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="init" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704419 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="init" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704758 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c20c4e93-62b6-4c52-9783-eca25d00c91f" containerName="swift-ring-rebalance" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704816 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b719a4c-76c9-4ac2-9b63-265e8d2ddea4" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.704833 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ec42f1-88e6-4104-9272-eaf4f4ae7326" containerName="dnsmasq-dns" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.706130 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.718640 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r4f68"] Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.754352 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nhtf\" (UniqueName: \"kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.754445 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.782786 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bf8-account-create-update-pxtd8"] Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.784094 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.786666 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.791858 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bf8-account-create-update-pxtd8"] Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.856191 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825\") pod \"keystone-5bf8-account-create-update-pxtd8\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.856655 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.857123 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhtf\" (UniqueName: \"kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.857251 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts\") pod \"keystone-5bf8-account-create-update-pxtd8\" (UID: 
\"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.858305 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.885894 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nhtf\" (UniqueName: \"kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf\") pod \"keystone-db-create-r4f68\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.961407 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts\") pod \"keystone-5bf8-account-create-update-pxtd8\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.961547 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825\") pod \"keystone-5bf8-account-create-update-pxtd8\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.962668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts\") pod 
\"keystone-5bf8-account-create-update-pxtd8\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:49 crc kubenswrapper[4745]: I1209 11:52:49.979035 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825\") pod \"keystone-5bf8-account-create-update-pxtd8\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.041132 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.110395 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.380685 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2kw4j"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.382307 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.389419 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2kw4j"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.472701 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cd7\" (UniqueName: \"kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7\") pod \"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.472772 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts\") pod \"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.520462 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r4f68"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.531460 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-980e-account-create-update-cllcd"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.532851 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.541969 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.550317 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-980e-account-create-update-cllcd"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.574712 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtf82\" (UniqueName: \"kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.574788 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.574987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cd7\" (UniqueName: \"kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7\") pod \"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.575023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts\") pod 
\"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.588505 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts\") pod \"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.608249 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4cd7\" (UniqueName: \"kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7\") pod \"placement-db-create-2kw4j\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.668529 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bf8-account-create-update-pxtd8"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.676600 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf82\" (UniqueName: \"kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.676673 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.678966 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: W1209 11:52:50.679301 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7 WatchSource:0}: Error finding container ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7: Status 404 returned error can't find the container with id ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7 Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.698105 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf82\" (UniqueName: \"kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82\") pod \"placement-980e-account-create-update-cllcd\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.707995 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.760183 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bf8-account-create-update-pxtd8" event={"ID":"311eddda-625a-4029-ba56-b408b2242eb5","Type":"ContainerStarted","Data":"ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7"} Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.761753 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4f68" event={"ID":"ca1b624b-3936-4cec-a1ba-b4efa1504020","Type":"ContainerStarted","Data":"4571aa0b6c65a337f5e02c0325f69e46bb7079de7dd610e205c196608f5ddffb"} Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.761779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4f68" event={"ID":"ca1b624b-3936-4cec-a1ba-b4efa1504020","Type":"ContainerStarted","Data":"4ec894708a30636b948f4020797a6eac2437f085005461cd1b5ddb361e969017"} Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.787354 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-r4f68" podStartSLOduration=1.787333383 podStartE2EDuration="1.787333383s" podCreationTimestamp="2025-12-09 11:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:50.784152877 +0000 UTC m=+1257.609354411" watchObservedRunningTime="2025-12-09 11:52:50.787333383 +0000 UTC m=+1257.612534907" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.802985 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rfdgn"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.804397 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.821384 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rfdgn"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.875745 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.947266 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4957-account-create-update-mgqk9"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.948714 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.956997 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.960059 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4957-account-create-update-mgqk9"] Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.993596 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdcqs\" (UniqueName: \"kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:50 crc kubenswrapper[4745]: I1209 11:52:50.993894 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.096010 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwvw\" (UniqueName: \"kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.096437 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.096575 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.096713 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdcqs\" (UniqueName: \"kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.098112 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc 
kubenswrapper[4745]: I1209 11:52:51.141401 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdcqs\" (UniqueName: \"kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs\") pod \"glance-db-create-rfdgn\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.141621 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.198592 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.198779 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwvw\" (UniqueName: \"kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.201249 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.248656 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwvw\" (UniqueName: 
\"kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw\") pod \"glance-4957-account-create-update-mgqk9\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.288064 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.338939 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2kw4j"] Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.348161 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dqr9b" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" probeResult="failure" output=< Dec 09 11:52:51 crc kubenswrapper[4745]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 09 11:52:51 crc kubenswrapper[4745]: > Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.360974 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.403303 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.518833 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-980e-account-create-update-cllcd"] Dec 09 11:52:51 crc kubenswrapper[4745]: W1209 11:52:51.529195 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5257a699_8e01_4460_9764_ae38c984495e.slice/crio-b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7 WatchSource:0}: Error finding container b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7: Status 404 
returned error can't find the container with id b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7 Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.652452 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dqr9b-config-swpwg"] Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.654254 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.660274 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.710042 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b-config-swpwg"] Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713346 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713382 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713418 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: 
\"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713763 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713826 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.713940 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: W1209 11:52:51.726724 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod390bc098_2623_4898_b666_0e615bfa815a.slice/crio-c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e WatchSource:0}: Error finding container c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e: Status 404 returned error can't find the container with id c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.777156 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-db-create-rfdgn"] Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.779047 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfdgn" event={"ID":"390bc098-2623-4898-b666-0e615bfa815a","Type":"ContainerStarted","Data":"c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.780665 4745 generic.go:334] "Generic (PLEG): container finished" podID="ca1b624b-3936-4cec-a1ba-b4efa1504020" containerID="4571aa0b6c65a337f5e02c0325f69e46bb7079de7dd610e205c196608f5ddffb" exitCode=0 Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.780714 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4f68" event={"ID":"ca1b624b-3936-4cec-a1ba-b4efa1504020","Type":"ContainerDied","Data":"4571aa0b6c65a337f5e02c0325f69e46bb7079de7dd610e205c196608f5ddffb"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.782821 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-980e-account-create-update-cllcd" event={"ID":"5257a699-8e01-4460-9764-ae38c984495e","Type":"ContainerStarted","Data":"336ca4940dee1ff5df0505b50698efc59baa43ad9f49067d6cb89f54fb4ca9ca"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.782849 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-980e-account-create-update-cllcd" event={"ID":"5257a699-8e01-4460-9764-ae38c984495e","Type":"ContainerStarted","Data":"b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.785110 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2kw4j" event={"ID":"ca2927f7-0512-4059-8f10-487ac6fbbad0","Type":"ContainerStarted","Data":"0637d530cebe187aef48d0bff50d3bd185303e1bb8780e94e67215ee199f6ccf"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.785160 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-db-create-2kw4j" event={"ID":"ca2927f7-0512-4059-8f10-487ac6fbbad0","Type":"ContainerStarted","Data":"c254e833abe9ecae473f8917ed7f8fe6e3c17336d6c98941a5354e680e4f5aca"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.790252 4745 generic.go:334] "Generic (PLEG): container finished" podID="311eddda-625a-4029-ba56-b408b2242eb5" containerID="809fbd96612b450289eed7e2404a3b4187004e68892406d748f16d6cbe48d42c" exitCode=0 Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.791085 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bf8-account-create-update-pxtd8" event={"ID":"311eddda-625a-4029-ba56-b408b2242eb5","Type":"ContainerDied","Data":"809fbd96612b450289eed7e2404a3b4187004e68892406d748f16d6cbe48d42c"} Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.815726 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.815774 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.815810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc 
kubenswrapper[4745]: I1209 11:52:51.816161 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.816172 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.815899 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.816438 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.816607 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.816697 
4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.816755 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.818890 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.828670 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-2kw4j" podStartSLOduration=1.828639872 podStartE2EDuration="1.828639872s" podCreationTimestamp="2025-12-09 11:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:51.821216542 +0000 UTC m=+1258.646418086" watchObservedRunningTime="2025-12-09 11:52:51.828639872 +0000 UTC m=+1258.653841396" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.841467 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-980e-account-create-update-cllcd" podStartSLOduration=1.841438377 podStartE2EDuration="1.841438377s" podCreationTimestamp="2025-12-09 11:52:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:51.837802179 +0000 UTC m=+1258.663003703" watchObservedRunningTime="2025-12-09 11:52:51.841438377 +0000 UTC m=+1258.666639901" Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.846132 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv\") pod \"ovn-controller-dqr9b-config-swpwg\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:51 crc kubenswrapper[4745]: W1209 11:52:51.865250 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707152c6_0c94_4f5b_8b1a_0ca318f9fe92.slice/crio-3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b WatchSource:0}: Error finding container 3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b: Status 404 returned error can't find the container with id 3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b Dec 09 11:52:51 crc kubenswrapper[4745]: I1209 11:52:51.871274 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4957-account-create-update-mgqk9"] Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.051168 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.483612 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b-config-swpwg"] Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.806572 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-swpwg" event={"ID":"1c8192c4-a534-4dda-ad08-12a189c53546","Type":"ContainerStarted","Data":"5698e77aa1e57119afef2f0a2e0f754cc78f10fd79added5c52ec7c05881d368"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.808018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-swpwg" event={"ID":"1c8192c4-a534-4dda-ad08-12a189c53546","Type":"ContainerStarted","Data":"1d35859fcb291bfabe8f891a428294dfee3b4204eb636bb0f166eba9f1e53af9"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.810341 4745 generic.go:334] "Generic (PLEG): container finished" podID="ca2927f7-0512-4059-8f10-487ac6fbbad0" containerID="0637d530cebe187aef48d0bff50d3bd185303e1bb8780e94e67215ee199f6ccf" exitCode=0 Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.810451 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2kw4j" event={"ID":"ca2927f7-0512-4059-8f10-487ac6fbbad0","Type":"ContainerDied","Data":"0637d530cebe187aef48d0bff50d3bd185303e1bb8780e94e67215ee199f6ccf"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.813854 4745 generic.go:334] "Generic (PLEG): container finished" podID="707152c6-0c94-4f5b-8b1a-0ca318f9fe92" containerID="a86fa369905db0b061fcf40232073cd840b343f60d8f376f143ad0c494a0513f" exitCode=0 Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.814042 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4957-account-create-update-mgqk9" 
event={"ID":"707152c6-0c94-4f5b-8b1a-0ca318f9fe92","Type":"ContainerDied","Data":"a86fa369905db0b061fcf40232073cd840b343f60d8f376f143ad0c494a0513f"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.814109 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4957-account-create-update-mgqk9" event={"ID":"707152c6-0c94-4f5b-8b1a-0ca318f9fe92","Type":"ContainerStarted","Data":"3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.816335 4745 generic.go:334] "Generic (PLEG): container finished" podID="390bc098-2623-4898-b666-0e615bfa815a" containerID="0a4d00aad6ad2329c45c9e731e08b417c52114d065594d61af1d66b9e7473f8e" exitCode=0 Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.816393 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfdgn" event={"ID":"390bc098-2623-4898-b666-0e615bfa815a","Type":"ContainerDied","Data":"0a4d00aad6ad2329c45c9e731e08b417c52114d065594d61af1d66b9e7473f8e"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.818363 4745 generic.go:334] "Generic (PLEG): container finished" podID="5257a699-8e01-4460-9764-ae38c984495e" containerID="336ca4940dee1ff5df0505b50698efc59baa43ad9f49067d6cb89f54fb4ca9ca" exitCode=0 Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.818435 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-980e-account-create-update-cllcd" event={"ID":"5257a699-8e01-4460-9764-ae38c984495e","Type":"ContainerDied","Data":"336ca4940dee1ff5df0505b50698efc59baa43ad9f49067d6cb89f54fb4ca9ca"} Dec 09 11:52:52 crc kubenswrapper[4745]: I1209 11:52:52.836833 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dqr9b-config-swpwg" podStartSLOduration=1.836807149 podStartE2EDuration="1.836807149s" podCreationTimestamp="2025-12-09 11:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:52.826984394 +0000 UTC m=+1259.652185918" watchObservedRunningTime="2025-12-09 11:52:52.836807149 +0000 UTC m=+1259.662008663" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.168668 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.249944 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts\") pod \"311eddda-625a-4029-ba56-b408b2242eb5\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.250434 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825\") pod \"311eddda-625a-4029-ba56-b408b2242eb5\" (UID: \"311eddda-625a-4029-ba56-b408b2242eb5\") " Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.250850 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "311eddda-625a-4029-ba56-b408b2242eb5" (UID: "311eddda-625a-4029-ba56-b408b2242eb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.284404 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825" (OuterVolumeSpecName: "kube-api-access-4q825") pod "311eddda-625a-4029-ba56-b408b2242eb5" (UID: "311eddda-625a-4029-ba56-b408b2242eb5"). 
InnerVolumeSpecName "kube-api-access-4q825". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.352033 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311eddda-625a-4029-ba56-b408b2242eb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.352069 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/311eddda-625a-4029-ba56-b408b2242eb5-kube-api-access-4q825\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.360044 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.453116 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nhtf\" (UniqueName: \"kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf\") pod \"ca1b624b-3936-4cec-a1ba-b4efa1504020\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.453227 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts\") pod \"ca1b624b-3936-4cec-a1ba-b4efa1504020\" (UID: \"ca1b624b-3936-4cec-a1ba-b4efa1504020\") " Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.453866 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca1b624b-3936-4cec-a1ba-b4efa1504020" (UID: "ca1b624b-3936-4cec-a1ba-b4efa1504020"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.456569 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf" (OuterVolumeSpecName: "kube-api-access-5nhtf") pod "ca1b624b-3936-4cec-a1ba-b4efa1504020" (UID: "ca1b624b-3936-4cec-a1ba-b4efa1504020"). InnerVolumeSpecName "kube-api-access-5nhtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.558141 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nhtf\" (UniqueName: \"kubernetes.io/projected/ca1b624b-3936-4cec-a1ba-b4efa1504020-kube-api-access-5nhtf\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.558180 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca1b624b-3936-4cec-a1ba-b4efa1504020-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.829964 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4f68" event={"ID":"ca1b624b-3936-4cec-a1ba-b4efa1504020","Type":"ContainerDied","Data":"4ec894708a30636b948f4020797a6eac2437f085005461cd1b5ddb361e969017"} Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.830207 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r4f68" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.830213 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec894708a30636b948f4020797a6eac2437f085005461cd1b5ddb361e969017" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.833089 4745 generic.go:334] "Generic (PLEG): container finished" podID="1c8192c4-a534-4dda-ad08-12a189c53546" containerID="5698e77aa1e57119afef2f0a2e0f754cc78f10fd79added5c52ec7c05881d368" exitCode=0 Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.833172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-swpwg" event={"ID":"1c8192c4-a534-4dda-ad08-12a189c53546","Type":"ContainerDied","Data":"5698e77aa1e57119afef2f0a2e0f754cc78f10fd79added5c52ec7c05881d368"} Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.835495 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bf8-account-create-update-pxtd8" event={"ID":"311eddda-625a-4029-ba56-b408b2242eb5","Type":"ContainerDied","Data":"ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7"} Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.835557 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7" Dec 09 11:52:53 crc kubenswrapper[4745]: I1209 11:52:53.835598 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bf8-account-create-update-pxtd8" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.354192 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.455947 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.486697 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.495428 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.499619 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtf82\" (UniqueName: \"kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82\") pod \"5257a699-8e01-4460-9764-ae38c984495e\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.499764 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts\") pod \"5257a699-8e01-4460-9764-ae38c984495e\" (UID: \"5257a699-8e01-4460-9764-ae38c984495e\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.500791 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5257a699-8e01-4460-9764-ae38c984495e" (UID: "5257a699-8e01-4460-9764-ae38c984495e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.510449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82" (OuterVolumeSpecName: "kube-api-access-qtf82") pod "5257a699-8e01-4460-9764-ae38c984495e" (UID: "5257a699-8e01-4460-9764-ae38c984495e"). 
InnerVolumeSpecName "kube-api-access-qtf82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.601197 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts\") pod \"390bc098-2623-4898-b666-0e615bfa815a\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.601368 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdcqs\" (UniqueName: \"kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs\") pod \"390bc098-2623-4898-b666-0e615bfa815a\" (UID: \"390bc098-2623-4898-b666-0e615bfa815a\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.601451 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4cd7\" (UniqueName: \"kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7\") pod \"ca2927f7-0512-4059-8f10-487ac6fbbad0\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.601853 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "390bc098-2623-4898-b666-0e615bfa815a" (UID: "390bc098-2623-4898-b666-0e615bfa815a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.601960 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwvw\" (UniqueName: \"kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw\") pod \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602041 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts\") pod \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\" (UID: \"707152c6-0c94-4f5b-8b1a-0ca318f9fe92\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602064 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts\") pod \"ca2927f7-0512-4059-8f10-487ac6fbbad0\" (UID: \"ca2927f7-0512-4059-8f10-487ac6fbbad0\") " Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602467 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "707152c6-0c94-4f5b-8b1a-0ca318f9fe92" (UID: "707152c6-0c94-4f5b-8b1a-0ca318f9fe92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602514 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca2927f7-0512-4059-8f10-487ac6fbbad0" (UID: "ca2927f7-0512-4059-8f10-487ac6fbbad0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602588 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5257a699-8e01-4460-9764-ae38c984495e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602603 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602612 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390bc098-2623-4898-b666-0e615bfa815a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.602622 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtf82\" (UniqueName: \"kubernetes.io/projected/5257a699-8e01-4460-9764-ae38c984495e-kube-api-access-qtf82\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.604402 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs" (OuterVolumeSpecName: "kube-api-access-kdcqs") pod "390bc098-2623-4898-b666-0e615bfa815a" (UID: "390bc098-2623-4898-b666-0e615bfa815a"). InnerVolumeSpecName "kube-api-access-kdcqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.605438 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw" (OuterVolumeSpecName: "kube-api-access-6hwvw") pod "707152c6-0c94-4f5b-8b1a-0ca318f9fe92" (UID: "707152c6-0c94-4f5b-8b1a-0ca318f9fe92"). 
InnerVolumeSpecName "kube-api-access-6hwvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.606020 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7" (OuterVolumeSpecName: "kube-api-access-l4cd7") pod "ca2927f7-0512-4059-8f10-487ac6fbbad0" (UID: "ca2927f7-0512-4059-8f10-487ac6fbbad0"). InnerVolumeSpecName "kube-api-access-l4cd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.704674 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdcqs\" (UniqueName: \"kubernetes.io/projected/390bc098-2623-4898-b666-0e615bfa815a-kube-api-access-kdcqs\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.705040 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4cd7\" (UniqueName: \"kubernetes.io/projected/ca2927f7-0512-4059-8f10-487ac6fbbad0-kube-api-access-l4cd7\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.705051 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwvw\" (UniqueName: \"kubernetes.io/projected/707152c6-0c94-4f5b-8b1a-0ca318f9fe92-kube-api-access-6hwvw\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.705064 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca2927f7-0512-4059-8f10-487ac6fbbad0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.846500 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4957-account-create-update-mgqk9" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.846479 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4957-account-create-update-mgqk9" event={"ID":"707152c6-0c94-4f5b-8b1a-0ca318f9fe92","Type":"ContainerDied","Data":"3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b"} Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.846622 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff80601570ac9840e1488297303a4257c3408b51c839c66e0b9cd8f17f8050b" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.849181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfdgn" event={"ID":"390bc098-2623-4898-b666-0e615bfa815a","Type":"ContainerDied","Data":"c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e"} Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.849214 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a6802b491ec28fc66488ad7516c092a194bd44a7b247cfc17df4c929a1ab7e" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.849222 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rfdgn" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.851916 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-980e-account-create-update-cllcd" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.853803 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-980e-account-create-update-cllcd" event={"ID":"5257a699-8e01-4460-9764-ae38c984495e","Type":"ContainerDied","Data":"b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7"} Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.853876 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8baa54287e8d7f57e07e85f488b902c957291d8ed57aeac590733dbea384ac7" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.854878 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2kw4j" Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.855650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2kw4j" event={"ID":"ca2927f7-0512-4059-8f10-487ac6fbbad0","Type":"ContainerDied","Data":"c254e833abe9ecae473f8917ed7f8fe6e3c17336d6c98941a5354e680e4f5aca"} Dec 09 11:52:54 crc kubenswrapper[4745]: I1209 11:52:54.855688 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c254e833abe9ecae473f8917ed7f8fe6e3c17336d6c98941a5354e680e4f5aca" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.174848 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.313989 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314100 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314184 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314259 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314313 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314344 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run\") pod \"1c8192c4-a534-4dda-ad08-12a189c53546\" (UID: \"1c8192c4-a534-4dda-ad08-12a189c53546\") " Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314799 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314848 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.314860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.315025 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run" (OuterVolumeSpecName: "var-run") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.315124 4745 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.315298 4745 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.316617 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts" (OuterVolumeSpecName: "scripts") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.316948 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.322970 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv" (OuterVolumeSpecName: "kube-api-access-ms6vv") pod "1c8192c4-a534-4dda-ad08-12a189c53546" (UID: "1c8192c4-a534-4dda-ad08-12a189c53546"). InnerVolumeSpecName "kube-api-access-ms6vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.326386 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"swift-storage-0\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " pod="openstack/swift-storage-0" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.361840 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.417554 4745 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.417899 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/1c8192c4-a534-4dda-ad08-12a189c53546-kube-api-access-ms6vv\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.418048 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c8192c4-a534-4dda-ad08-12a189c53546-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.418170 4745 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c8192c4-a534-4dda-ad08-12a189c53546-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.865104 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-swpwg" event={"ID":"1c8192c4-a534-4dda-ad08-12a189c53546","Type":"ContainerDied","Data":"1d35859fcb291bfabe8f891a428294dfee3b4204eb636bb0f166eba9f1e53af9"} Dec 09 11:52:55 crc kubenswrapper[4745]: 
I1209 11:52:55.865484 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d35859fcb291bfabe8f891a428294dfee3b4204eb636bb0f166eba9f1e53af9" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.865324 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-swpwg" Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.971816 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dqr9b-config-swpwg"] Dec 09 11:52:55 crc kubenswrapper[4745]: I1209 11:52:55.983858 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dqr9b-config-swpwg"] Dec 09 11:52:56 crc kubenswrapper[4745]: W1209 11:52:56.000589 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ebc86b_4ef3_4d3f_911f_93036c9ce19b.slice/crio-605f894bbcb8a27461251589d7de2e0029db9035d8ce2c6a92d5182e8747c7d0 WatchSource:0}: Error finding container 605f894bbcb8a27461251589d7de2e0029db9035d8ce2c6a92d5182e8747c7d0: Status 404 returned error can't find the container with id 605f894bbcb8a27461251589d7de2e0029db9035d8ce2c6a92d5182e8747c7d0 Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.001682 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.042951 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dqr9b-config-fxdrp"] Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.043380 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311eddda-625a-4029-ba56-b408b2242eb5" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.043401 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="311eddda-625a-4029-ba56-b408b2242eb5" containerName="mariadb-account-create-update" Dec 09 
11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.043423 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390bc098-2623-4898-b666-0e615bfa815a" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.043433 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="390bc098-2623-4898-b666-0e615bfa815a" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.043452 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2927f7-0512-4059-8f10-487ac6fbbad0" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.043462 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2927f7-0512-4059-8f10-487ac6fbbad0" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.043477 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707152c6-0c94-4f5b-8b1a-0ca318f9fe92" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.043487 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="707152c6-0c94-4f5b-8b1a-0ca318f9fe92" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.043513 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1b624b-3936-4cec-a1ba-b4efa1504020" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.045660 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1b624b-3936-4cec-a1ba-b4efa1504020" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.045723 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8192c4-a534-4dda-ad08-12a189c53546" containerName="ovn-config" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.045734 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c8192c4-a534-4dda-ad08-12a189c53546" containerName="ovn-config" Dec 09 11:52:56 crc kubenswrapper[4745]: E1209 11:52:56.045754 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5257a699-8e01-4460-9764-ae38c984495e" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.045761 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5257a699-8e01-4460-9764-ae38c984495e" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046111 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="390bc098-2623-4898-b666-0e615bfa815a" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046135 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2927f7-0512-4059-8f10-487ac6fbbad0" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046153 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="707152c6-0c94-4f5b-8b1a-0ca318f9fe92" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046172 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8192c4-a534-4dda-ad08-12a189c53546" containerName="ovn-config" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046187 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="311eddda-625a-4029-ba56-b408b2242eb5" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046202 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1b624b-3936-4cec-a1ba-b4efa1504020" containerName="mariadb-database-create" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.046212 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="5257a699-8e01-4460-9764-ae38c984495e" containerName="mariadb-account-create-update" Dec 09 11:52:56 crc 
kubenswrapper[4745]: I1209 11:52:56.047006 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.052109 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.063227 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b-config-fxdrp"] Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.133897 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.134020 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqcch\" (UniqueName: \"kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.134065 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.134111 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.134176 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.134215 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.235793 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqcch\" (UniqueName: \"kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.235857 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.235898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.235945 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.235980 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.236027 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.236318 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.237113 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.237178 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.237219 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.238491 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.249174 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pw7nc"] Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.250364 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.254131 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.254192 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vchrd" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.263247 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqcch\" (UniqueName: \"kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch\") pod \"ovn-controller-dqr9b-config-fxdrp\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.265509 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pw7nc"] Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.338796 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dqr9b" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.339999 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kj79\" (UniqueName: \"kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.340752 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.340828 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.340910 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.377832 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.442232 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.442297 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kj79\" (UniqueName: \"kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.442380 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " 
pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.442411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.447708 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.449151 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.453215 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.471655 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kj79\" (UniqueName: \"kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79\") pod \"glance-db-sync-pw7nc\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.625659 4745 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-db-sync-pw7nc" Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.852324 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dqr9b-config-fxdrp"] Dec 09 11:52:56 crc kubenswrapper[4745]: I1209 11:52:56.940100 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"605f894bbcb8a27461251589d7de2e0029db9035d8ce2c6a92d5182e8747c7d0"} Dec 09 11:52:57 crc kubenswrapper[4745]: I1209 11:52:57.004242 4745 generic.go:334] "Generic (PLEG): container finished" podID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerID="3183cfb66289decc58035a93ed3c6db16f62f8769a1980e9dea7b5da9edf7ad3" exitCode=0 Dec 09 11:52:57 crc kubenswrapper[4745]: I1209 11:52:57.004315 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b860955-30eb-40e6-bd56-caf6098aed8a","Type":"ContainerDied","Data":"3183cfb66289decc58035a93ed3c6db16f62f8769a1980e9dea7b5da9edf7ad3"} Dec 09 11:52:57 crc kubenswrapper[4745]: I1209 11:52:57.570881 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8192c4-a534-4dda-ad08-12a189c53546" path="/var/lib/kubelet/pods/1c8192c4-a534-4dda-ad08-12a189c53546/volumes" Dec 09 11:52:57 crc kubenswrapper[4745]: I1209 11:52:57.572944 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pw7nc"] Dec 09 11:52:57 crc kubenswrapper[4745]: W1209 11:52:57.581623 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf657ee5_9433_4e1a_9a66_33f59c0e5b0a.slice/crio-522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac WatchSource:0}: Error finding container 522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac: Status 404 returned error can't find the container with id 
522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.026796 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-fxdrp" event={"ID":"d66d6606-e46b-4d0d-af45-e65749212dea","Type":"ContainerStarted","Data":"e0476a986f9634758ad0014851f819c295e0cb2787e1d0cd344530e1064f8968"} Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.027162 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-fxdrp" event={"ID":"d66d6606-e46b-4d0d-af45-e65749212dea","Type":"ContainerStarted","Data":"84bb5b5fce8e0bb0b5d038a242f37235a28837911e7823496ab282b13097befc"} Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.029353 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pw7nc" event={"ID":"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a","Type":"ContainerStarted","Data":"522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac"} Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.031909 4745 generic.go:334] "Generic (PLEG): container finished" podID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerID="4b5baf1df440151578f138c2907e5ac84a040ebc7d009362d8e110c6311812ee" exitCode=0 Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.031978 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerDied","Data":"4b5baf1df440151578f138c2907e5ac84a040ebc7d009362d8e110c6311812ee"} Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.035050 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b860955-30eb-40e6-bd56-caf6098aed8a","Type":"ContainerStarted","Data":"31bf789bc4e44645e0b834e648dabfc8f57c6ad93e2976fce50cb6120b8850cf"} Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.035446 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.087811 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dqr9b-config-fxdrp" podStartSLOduration=2.08779392 podStartE2EDuration="2.08779392s" podCreationTimestamp="2025-12-09 11:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:58.060839874 +0000 UTC m=+1264.886041408" watchObservedRunningTime="2025-12-09 11:52:58.08779392 +0000 UTC m=+1264.912995444" Dec 09 11:52:58 crc kubenswrapper[4745]: I1209 11:52:58.088465 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.900015107 podStartE2EDuration="1m23.088458538s" podCreationTimestamp="2025-12-09 11:51:35 +0000 UTC" firstStartedPulling="2025-12-09 11:51:37.763734016 +0000 UTC m=+1184.588935530" lastFinishedPulling="2025-12-09 11:52:22.952177437 +0000 UTC m=+1229.777378961" observedRunningTime="2025-12-09 11:52:58.085138148 +0000 UTC m=+1264.910339682" watchObservedRunningTime="2025-12-09 11:52:58.088458538 +0000 UTC m=+1264.913660062" Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.044488 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281"} Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.044889 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462"} Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.046006 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="d66d6606-e46b-4d0d-af45-e65749212dea" containerID="e0476a986f9634758ad0014851f819c295e0cb2787e1d0cd344530e1064f8968" exitCode=0 Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.046044 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-fxdrp" event={"ID":"d66d6606-e46b-4d0d-af45-e65749212dea","Type":"ContainerDied","Data":"e0476a986f9634758ad0014851f819c295e0cb2787e1d0cd344530e1064f8968"} Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.049248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerStarted","Data":"8dae8dcac1defac8c58cb335ba486b55e9b6076bf8990bbbc17743c619b726cb"} Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.049826 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:52:59 crc kubenswrapper[4745]: I1209 11:52:59.118869 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371951.735926 podStartE2EDuration="1m25.118850443s" podCreationTimestamp="2025-12-09 11:51:34 +0000 UTC" firstStartedPulling="2025-12-09 11:51:37.079536013 +0000 UTC m=+1183.904737537" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:52:59.109294906 +0000 UTC m=+1265.934496430" watchObservedRunningTime="2025-12-09 11:52:59.118850443 +0000 UTC m=+1265.944051957" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.059889 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd"} Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.060591 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb"} Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.429793 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.516649 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.516775 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517100 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517217 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqcch\" (UniqueName: \"kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517254 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517354 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517385 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts\") pod \"d66d6606-e46b-4d0d-af45-e65749212dea\" (UID: \"d66d6606-e46b-4d0d-af45-e65749212dea\") " Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517448 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517486 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run" (OuterVolumeSpecName: "var-run") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.517999 4745 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.518018 4745 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.518027 4745 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d66d6606-e46b-4d0d-af45-e65749212dea-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.518356 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.518546 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts" (OuterVolumeSpecName: "scripts") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.527808 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch" (OuterVolumeSpecName: "kube-api-access-vqcch") pod "d66d6606-e46b-4d0d-af45-e65749212dea" (UID: "d66d6606-e46b-4d0d-af45-e65749212dea"). InnerVolumeSpecName "kube-api-access-vqcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.619778 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.619819 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqcch\" (UniqueName: \"kubernetes.io/projected/d66d6606-e46b-4d0d-af45-e65749212dea-kube-api-access-vqcch\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:00 crc kubenswrapper[4745]: I1209 11:53:00.619830 4745 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d66d6606-e46b-4d0d-af45-e65749212dea-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.084847 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b-config-fxdrp" 
event={"ID":"d66d6606-e46b-4d0d-af45-e65749212dea","Type":"ContainerDied","Data":"84bb5b5fce8e0bb0b5d038a242f37235a28837911e7823496ab282b13097befc"} Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.084925 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84bb5b5fce8e0bb0b5d038a242f37235a28837911e7823496ab282b13097befc" Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.084941 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b-config-fxdrp" Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.146033 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dqr9b-config-fxdrp"] Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.164072 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dqr9b-config-fxdrp"] Dec 09 11:53:01 crc kubenswrapper[4745]: I1209 11:53:01.580717 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66d6606-e46b-4d0d-af45-e65749212dea" path="/var/lib/kubelet/pods/d66d6606-e46b-4d0d-af45-e65749212dea/volumes" Dec 09 11:53:02 crc kubenswrapper[4745]: I1209 11:53:02.105247 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11"} Dec 09 11:53:02 crc kubenswrapper[4745]: I1209 11:53:02.106587 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6"} Dec 09 11:53:02 crc kubenswrapper[4745]: E1209 11:53:02.123168 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:03 crc kubenswrapper[4745]: I1209 11:53:03.118484 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b"} Dec 09 11:53:03 crc kubenswrapper[4745]: I1209 11:53:03.118826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031"} Dec 09 11:53:04 crc kubenswrapper[4745]: I1209 11:53:04.136705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934"} Dec 09 11:53:05 crc kubenswrapper[4745]: I1209 11:53:05.157369 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463"} Dec 09 11:53:06 crc kubenswrapper[4745]: I1209 11:53:06.182633 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12"} Dec 09 11:53:12 crc kubenswrapper[4745]: E1209 11:53:12.359655 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:14 crc kubenswrapper[4745]: I1209 11:53:14.291560 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972"} Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.305066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003"} Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.305119 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7"} Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.305129 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerStarted","Data":"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620"} Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.308406 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pw7nc" event={"ID":"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a","Type":"ContainerStarted","Data":"67cf2989b9dc0e9ab08763199525684b72ba2bee0348375259236deaa1092c59"} Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.356841 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.413653774 podStartE2EDuration="53.356819526s" 
podCreationTimestamp="2025-12-09 11:52:22 +0000 UTC" firstStartedPulling="2025-12-09 11:52:56.00285339 +0000 UTC m=+1262.828054914" lastFinishedPulling="2025-12-09 11:53:03.946019142 +0000 UTC m=+1270.771220666" observedRunningTime="2025-12-09 11:53:15.349487529 +0000 UTC m=+1282.174689063" watchObservedRunningTime="2025-12-09 11:53:15.356819526 +0000 UTC m=+1282.182021050" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.378168 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pw7nc" podStartSLOduration=2.865124079 podStartE2EDuration="19.37814345s" podCreationTimestamp="2025-12-09 11:52:56 +0000 UTC" firstStartedPulling="2025-12-09 11:52:57.583969043 +0000 UTC m=+1264.409170577" lastFinishedPulling="2025-12-09 11:53:14.096988424 +0000 UTC m=+1280.922189948" observedRunningTime="2025-12-09 11:53:15.373035213 +0000 UTC m=+1282.198236747" watchObservedRunningTime="2025-12-09 11:53:15.37814345 +0000 UTC m=+1282.203344974" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.647583 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:53:15 crc kubenswrapper[4745]: E1209 11:53:15.648225 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66d6606-e46b-4d0d-af45-e65749212dea" containerName="ovn-config" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.648246 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66d6606-e46b-4d0d-af45-e65749212dea" containerName="ovn-config" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.648487 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66d6606-e46b-4d0d-af45-e65749212dea" containerName="ovn-config" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.649642 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.651650 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.661873 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.799402 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.799457 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnz46\" (UniqueName: \"kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.799523 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.799753 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " 
pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.799831 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.800019 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902150 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902176 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnz46\" (UniqueName: \"kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" 
Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902224 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902258 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.902297 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.903193 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.903275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.903290 
4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.903367 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.903583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.930418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnz46\" (UniqueName: \"kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46\") pod \"dnsmasq-dns-75bdffd66f-chqjr\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:15 crc kubenswrapper[4745]: I1209 11:53:15.971313 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:16 crc kubenswrapper[4745]: I1209 11:53:16.356742 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:53:16 crc kubenswrapper[4745]: I1209 11:53:16.456753 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:53:16 crc kubenswrapper[4745]: W1209 11:53:16.467824 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod484908d8_e648_4fa4_954b_490f2f06ebb6.slice/crio-2bced92cf5e31be9894291401d8805cfb36d58fc8fd755a9a68a522600166e53 WatchSource:0}: Error finding container 2bced92cf5e31be9894291401d8805cfb36d58fc8fd755a9a68a522600166e53: Status 404 returned error can't find the container with id 2bced92cf5e31be9894291401d8805cfb36d58fc8fd755a9a68a522600166e53 Dec 09 11:53:16 crc kubenswrapper[4745]: I1209 11:53:16.836774 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 11:53:17 crc kubenswrapper[4745]: I1209 11:53:17.330723 4745 generic.go:334] "Generic (PLEG): container finished" podID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerID="b28c2d59d8afd69dd3c2c32acf8a0fb3dae8874a168f38806487c53880473616" exitCode=0 Dec 09 11:53:17 crc kubenswrapper[4745]: I1209 11:53:17.330818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" event={"ID":"484908d8-e648-4fa4-954b-490f2f06ebb6","Type":"ContainerDied","Data":"b28c2d59d8afd69dd3c2c32acf8a0fb3dae8874a168f38806487c53880473616"} Dec 09 11:53:17 crc kubenswrapper[4745]: I1209 11:53:17.331207 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" 
event={"ID":"484908d8-e648-4fa4-954b-490f2f06ebb6","Type":"ContainerStarted","Data":"2bced92cf5e31be9894291401d8805cfb36d58fc8fd755a9a68a522600166e53"} Dec 09 11:53:18 crc kubenswrapper[4745]: I1209 11:53:18.920861 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xl2mt"] Dec 09 11:53:18 crc kubenswrapper[4745]: I1209 11:53:18.924308 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:18 crc kubenswrapper[4745]: I1209 11:53:18.953413 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xl2mt"] Dec 09 11:53:18 crc kubenswrapper[4745]: I1209 11:53:18.981250 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:18 crc kubenswrapper[4745]: I1209 11:53:18.981366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsp5\" (UniqueName: \"kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.029148 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qh9lk"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.030395 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.052786 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qh9lk"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.083085 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.083204 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zsp5\" (UniqueName: \"kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.083890 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.114143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zsp5\" (UniqueName: \"kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5\") pod \"cinder-db-create-xl2mt\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.178392 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-686d-account-create-update-lkjxb"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.179828 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.185406 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szcgh\" (UniqueName: \"kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.185691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.189338 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.192878 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-686d-account-create-update-lkjxb"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.244902 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.246809 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-83f7-account-create-update-t6hf9"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.248006 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.250209 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.266276 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-83f7-account-create-update-t6hf9"] Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.290009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szcgh\" (UniqueName: \"kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.290205 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nckp\" (UniqueName: \"kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.290245 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.290335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: 
\"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.291847 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.328860 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szcgh\" (UniqueName: \"kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh\") pod \"barbican-db-create-qh9lk\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.349201 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.353235 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" event={"ID":"484908d8-e648-4fa4-954b-490f2f06ebb6","Type":"ContainerStarted","Data":"a3201869945e2890e32a0e743f4f04aa015cf684636a1287e3cdf9f3358fc160"} Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.354260 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.391993 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nckp\" (UniqueName: \"kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc 
kubenswrapper[4745]: I1209 11:53:19.392047 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.392099 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpxz\" (UniqueName: \"kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.392129 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.392890 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:19 crc kubenswrapper[4745]: I1209 11:53:19.393109 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" podStartSLOduration=4.393090799 podStartE2EDuration="4.393090799s" podCreationTimestamp="2025-12-09 11:53:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:19.376106632 +0000 UTC m=+1286.201308156" watchObservedRunningTime="2025-12-09 11:53:19.393090799 +0000 UTC m=+1286.218292323" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.434970 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nckp\" (UniqueName: \"kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp\") pod \"cinder-686d-account-create-update-lkjxb\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.472414 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4966q"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.479940 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.493748 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4966q"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.499061 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.499142 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpxz\" (UniqueName: \"kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " 
pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.502820 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.503005 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.514534 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p9tff"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.516471 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.540722 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.541236 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.542058 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fh2d2" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.544129 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.552593 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpxz\" (UniqueName: \"kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz\") pod \"barbican-83f7-account-create-update-t6hf9\" (UID: 
\"12417d7c-b1ee-406f-b60c-415562c15782\") " pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.596801 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p9tff"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.600659 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q4x\" (UniqueName: \"kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.600760 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tnp\" (UniqueName: \"kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.600849 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.600994 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.601024 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.618111 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1fc4-account-create-update-qwf8c"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.619995 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.626286 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.628673 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1fc4-account-create-update-qwf8c"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts\") pod \"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702424 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mpj\" (UniqueName: \"kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj\") pod \"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702547 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702580 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702668 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q4x\" (UniqueName: \"kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702712 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tnp\" (UniqueName: \"kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.702796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.703232 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.716335 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.724092 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.746706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tnp\" (UniqueName: \"kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp\") pod \"keystone-db-sync-p9tff\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.754913 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.759377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q4x\" (UniqueName: \"kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x\") pod \"neutron-db-create-4966q\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.804419 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts\") pod \"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.804464 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mpj\" (UniqueName: \"kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj\") pod \"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.805642 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts\") pod \"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.828329 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mpj\" (UniqueName: \"kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj\") pod 
\"neutron-1fc4-account-create-update-qwf8c\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.850721 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4966q" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.861912 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:19.955859 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.714077 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-83f7-account-create-update-t6hf9"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.736060 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1fc4-account-create-update-qwf8c"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.746736 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qh9lk"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.755790 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xl2mt"] Dec 09 11:53:20 crc kubenswrapper[4745]: W1209 11:53:20.756570 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0683710_3d66_4128_981a_227590aa97a0.slice/crio-a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221 WatchSource:0}: Error finding container a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221: Status 404 returned error can't find the container with id a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221 Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.763573 4745 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-686d-account-create-update-lkjxb"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.771826 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4966q"] Dec 09 11:53:20 crc kubenswrapper[4745]: I1209 11:53:20.779030 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p9tff"] Dec 09 11:53:20 crc kubenswrapper[4745]: W1209 11:53:20.790972 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00d4e762_f4d9_4409_9051_d396157c0e90.slice/crio-c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25 WatchSource:0}: Error finding container c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25: Status 404 returned error can't find the container with id c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25 Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.378710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9tff" event={"ID":"29c95beb-2818-4a46-813c-b53fedce2a59","Type":"ContainerStarted","Data":"8f13d0f13b9ccb652de7ae162e32ffa14aa9bd0d58f2b8d119a63bcc688420d7"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.406383 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xl2mt" event={"ID":"d0683710-3d66-4128-981a-227590aa97a0","Type":"ContainerStarted","Data":"b9e81156e03227008305a4610edbd9b02d41542054e6710a627d02509756b16f"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.406473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xl2mt" event={"ID":"d0683710-3d66-4128-981a-227590aa97a0","Type":"ContainerStarted","Data":"a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.433093 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-83f7-account-create-update-t6hf9" event={"ID":"12417d7c-b1ee-406f-b60c-415562c15782","Type":"ContainerStarted","Data":"005bbf69bfe8e201237c0b32380a847d1254e598d36321c440e5530aef3a86f2"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.433158 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-83f7-account-create-update-t6hf9" event={"ID":"12417d7c-b1ee-406f-b60c-415562c15782","Type":"ContainerStarted","Data":"a3dc9859db8b775874f0b0b1880e962969ac34bae8706c6a5bb9b5f9de695c4b"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.435637 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-xl2mt" podStartSLOduration=3.435604757 podStartE2EDuration="3.435604757s" podCreationTimestamp="2025-12-09 11:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.433924952 +0000 UTC m=+1288.259126476" watchObservedRunningTime="2025-12-09 11:53:21.435604757 +0000 UTC m=+1288.260806281" Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.440267 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qh9lk" event={"ID":"753eb7ac-080d-4a10-9bba-e3ed44d80985","Type":"ContainerStarted","Data":"0885f35be48b9edafa3024a22995fb186c052c7538483ed9d73e519b4bd4b0f3"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.440328 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qh9lk" event={"ID":"753eb7ac-080d-4a10-9bba-e3ed44d80985","Type":"ContainerStarted","Data":"af7e6c2b1feb635cae91f168d18fc988de3aea199e3db2ed3d9998cfc09c0942"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.457820 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4966q" 
event={"ID":"00d4e762-f4d9-4409-9051-d396157c0e90","Type":"ContainerStarted","Data":"e1b7bff6592b36237dc72ae840ee30fdfd4ffc88f26ccd26119c13ab09e03659"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.457877 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4966q" event={"ID":"00d4e762-f4d9-4409-9051-d396157c0e90","Type":"ContainerStarted","Data":"c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.468491 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-686d-account-create-update-lkjxb" event={"ID":"1b254d08-6ebd-492e-b955-afc9bed1b627","Type":"ContainerStarted","Data":"b605e8f019c64255474366b2258f98198c1a71d83e50c329de38b59dbe8bf725"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.468561 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-686d-account-create-update-lkjxb" event={"ID":"1b254d08-6ebd-492e-b955-afc9bed1b627","Type":"ContainerStarted","Data":"36aff8ee534f56271365c29bec298c7b7ce859ba464b31e35dd3b6f157b17c30"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.475159 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1fc4-account-create-update-qwf8c" event={"ID":"2319532e-1ba5-4616-8363-16f109438bd8","Type":"ContainerStarted","Data":"1f74f4a21b2673333af47e87beb31b19b062f5c33852b85486c3138b937f1825"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.475200 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1fc4-account-create-update-qwf8c" event={"ID":"2319532e-1ba5-4616-8363-16f109438bd8","Type":"ContainerStarted","Data":"0203c33a4d9d2e6fc6b10c1e0b6227f5c61cbbe31209f0a79febd75533e31690"} Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.487878 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-83f7-account-create-update-t6hf9" podStartSLOduration=2.487851844 
podStartE2EDuration="2.487851844s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.456657344 +0000 UTC m=+1288.281858888" watchObservedRunningTime="2025-12-09 11:53:21.487851844 +0000 UTC m=+1288.313053358" Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.491386 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-qh9lk" podStartSLOduration=2.491378449 podStartE2EDuration="2.491378449s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.475622865 +0000 UTC m=+1288.300824389" watchObservedRunningTime="2025-12-09 11:53:21.491378449 +0000 UTC m=+1288.316579973" Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.497433 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-686d-account-create-update-lkjxb" podStartSLOduration=2.497413502 podStartE2EDuration="2.497413502s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.495261514 +0000 UTC m=+1288.320463038" watchObservedRunningTime="2025-12-09 11:53:21.497413502 +0000 UTC m=+1288.322615026" Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.515469 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1fc4-account-create-update-qwf8c" podStartSLOduration=2.515450147 podStartE2EDuration="2.515450147s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.509658061 +0000 UTC 
m=+1288.334859585" watchObservedRunningTime="2025-12-09 11:53:21.515450147 +0000 UTC m=+1288.340651661" Dec 09 11:53:21 crc kubenswrapper[4745]: I1209 11:53:21.553016 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-4966q" podStartSLOduration=2.552963387 podStartE2EDuration="2.552963387s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:21.525380365 +0000 UTC m=+1288.350581889" watchObservedRunningTime="2025-12-09 11:53:21.552963387 +0000 UTC m=+1288.378164911" Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.485365 4745 generic.go:334] "Generic (PLEG): container finished" podID="1b254d08-6ebd-492e-b955-afc9bed1b627" containerID="b605e8f019c64255474366b2258f98198c1a71d83e50c329de38b59dbe8bf725" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.485477 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-686d-account-create-update-lkjxb" event={"ID":"1b254d08-6ebd-492e-b955-afc9bed1b627","Type":"ContainerDied","Data":"b605e8f019c64255474366b2258f98198c1a71d83e50c329de38b59dbe8bf725"} Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.487068 4745 generic.go:334] "Generic (PLEG): container finished" podID="2319532e-1ba5-4616-8363-16f109438bd8" containerID="1f74f4a21b2673333af47e87beb31b19b062f5c33852b85486c3138b937f1825" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.487121 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1fc4-account-create-update-qwf8c" event={"ID":"2319532e-1ba5-4616-8363-16f109438bd8","Type":"ContainerDied","Data":"1f74f4a21b2673333af47e87beb31b19b062f5c33852b85486c3138b937f1825"} Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.490625 4745 generic.go:334] "Generic (PLEG): container finished" podID="d0683710-3d66-4128-981a-227590aa97a0" 
containerID="b9e81156e03227008305a4610edbd9b02d41542054e6710a627d02509756b16f" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.490726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xl2mt" event={"ID":"d0683710-3d66-4128-981a-227590aa97a0","Type":"ContainerDied","Data":"b9e81156e03227008305a4610edbd9b02d41542054e6710a627d02509756b16f"} Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.492582 4745 generic.go:334] "Generic (PLEG): container finished" podID="12417d7c-b1ee-406f-b60c-415562c15782" containerID="005bbf69bfe8e201237c0b32380a847d1254e598d36321c440e5530aef3a86f2" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.492672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-83f7-account-create-update-t6hf9" event={"ID":"12417d7c-b1ee-406f-b60c-415562c15782","Type":"ContainerDied","Data":"005bbf69bfe8e201237c0b32380a847d1254e598d36321c440e5530aef3a86f2"} Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.494878 4745 generic.go:334] "Generic (PLEG): container finished" podID="753eb7ac-080d-4a10-9bba-e3ed44d80985" containerID="0885f35be48b9edafa3024a22995fb186c052c7538483ed9d73e519b4bd4b0f3" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.494945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qh9lk" event={"ID":"753eb7ac-080d-4a10-9bba-e3ed44d80985","Type":"ContainerDied","Data":"0885f35be48b9edafa3024a22995fb186c052c7538483ed9d73e519b4bd4b0f3"} Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.499613 4745 generic.go:334] "Generic (PLEG): container finished" podID="00d4e762-f4d9-4409-9051-d396157c0e90" containerID="e1b7bff6592b36237dc72ae840ee30fdfd4ffc88f26ccd26119c13ab09e03659" exitCode=0 Dec 09 11:53:22 crc kubenswrapper[4745]: I1209 11:53:22.499684 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4966q" 
event={"ID":"00d4e762-f4d9-4409-9051-d396157c0e90","Type":"ContainerDied","Data":"e1b7bff6592b36237dc72ae840ee30fdfd4ffc88f26ccd26119c13ab09e03659"} Dec 09 11:53:22 crc kubenswrapper[4745]: E1209 11:53:22.618109 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.464011 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.470495 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.475183 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.475263 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.494720 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.524883 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537576 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts\") pod \"753eb7ac-080d-4a10-9bba-e3ed44d80985\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537626 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts\") pod \"2319532e-1ba5-4616-8363-16f109438bd8\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537745 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mpj\" (UniqueName: \"kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj\") pod \"2319532e-1ba5-4616-8363-16f109438bd8\" (UID: \"2319532e-1ba5-4616-8363-16f109438bd8\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts\") pod \"1b254d08-6ebd-492e-b955-afc9bed1b627\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537889 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nckp\" (UniqueName: \"kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp\") 
pod \"1b254d08-6ebd-492e-b955-afc9bed1b627\" (UID: \"1b254d08-6ebd-492e-b955-afc9bed1b627\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.537982 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szcgh\" (UniqueName: \"kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh\") pod \"753eb7ac-080d-4a10-9bba-e3ed44d80985\" (UID: \"753eb7ac-080d-4a10-9bba-e3ed44d80985\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.538005 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shpxz\" (UniqueName: \"kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz\") pod \"12417d7c-b1ee-406f-b60c-415562c15782\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.538047 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts\") pod \"12417d7c-b1ee-406f-b60c-415562c15782\" (UID: \"12417d7c-b1ee-406f-b60c-415562c15782\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.539017 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b254d08-6ebd-492e-b955-afc9bed1b627" (UID: "1b254d08-6ebd-492e-b955-afc9bed1b627"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.539042 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12417d7c-b1ee-406f-b60c-415562c15782" (UID: "12417d7c-b1ee-406f-b60c-415562c15782"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.539069 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2319532e-1ba5-4616-8363-16f109438bd8" (UID: "2319532e-1ba5-4616-8363-16f109438bd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.542004 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "753eb7ac-080d-4a10-9bba-e3ed44d80985" (UID: "753eb7ac-080d-4a10-9bba-e3ed44d80985"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.542283 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp" (OuterVolumeSpecName: "kube-api-access-8nckp") pod "1b254d08-6ebd-492e-b955-afc9bed1b627" (UID: "1b254d08-6ebd-492e-b955-afc9bed1b627"). InnerVolumeSpecName "kube-api-access-8nckp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.543826 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4966q" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.545218 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj" (OuterVolumeSpecName: "kube-api-access-v9mpj") pod "2319532e-1ba5-4616-8363-16f109438bd8" (UID: "2319532e-1ba5-4616-8363-16f109438bd8"). InnerVolumeSpecName "kube-api-access-v9mpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.547403 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-686d-account-create-update-lkjxb" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.549187 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh" (OuterVolumeSpecName: "kube-api-access-szcgh") pod "753eb7ac-080d-4a10-9bba-e3ed44d80985" (UID: "753eb7ac-080d-4a10-9bba-e3ed44d80985"). InnerVolumeSpecName "kube-api-access-szcgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.549203 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-686d-account-create-update-lkjxb" event={"ID":"1b254d08-6ebd-492e-b955-afc9bed1b627","Type":"ContainerDied","Data":"36aff8ee534f56271365c29bec298c7b7ce859ba464b31e35dd3b6f157b17c30"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.549262 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36aff8ee534f56271365c29bec298c7b7ce859ba464b31e35dd3b6f157b17c30" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.554105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1fc4-account-create-update-qwf8c" event={"ID":"2319532e-1ba5-4616-8363-16f109438bd8","Type":"ContainerDied","Data":"0203c33a4d9d2e6fc6b10c1e0b6227f5c61cbbe31209f0a79febd75533e31690"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.554215 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0203c33a4d9d2e6fc6b10c1e0b6227f5c61cbbe31209f0a79febd75533e31690" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.554234 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1fc4-account-create-update-qwf8c" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.557971 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz" (OuterVolumeSpecName: "kube-api-access-shpxz") pod "12417d7c-b1ee-406f-b60c-415562c15782" (UID: "12417d7c-b1ee-406f-b60c-415562c15782"). InnerVolumeSpecName "kube-api-access-shpxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.564089 4745 generic.go:334] "Generic (PLEG): container finished" podID="bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" containerID="67cf2989b9dc0e9ab08763199525684b72ba2bee0348375259236deaa1092c59" exitCode=0 Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.566098 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-83f7-account-create-update-t6hf9" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.579729 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4966q" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.584500 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qh9lk" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595679 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xl2mt" event={"ID":"d0683710-3d66-4128-981a-227590aa97a0","Type":"ContainerDied","Data":"a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595718 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21d42fd6579828e1f8e2495dfa1caef7cfa51d42d171b12c595de494d4e2221" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595734 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pw7nc" event={"ID":"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a","Type":"ContainerDied","Data":"67cf2989b9dc0e9ab08763199525684b72ba2bee0348375259236deaa1092c59"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-83f7-account-create-update-t6hf9" 
event={"ID":"12417d7c-b1ee-406f-b60c-415562c15782","Type":"ContainerDied","Data":"a3dc9859db8b775874f0b0b1880e962969ac34bae8706c6a5bb9b5f9de695c4b"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595782 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3dc9859db8b775874f0b0b1880e962969ac34bae8706c6a5bb9b5f9de695c4b" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595793 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4966q" event={"ID":"00d4e762-f4d9-4409-9051-d396157c0e90","Type":"ContainerDied","Data":"c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595804 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6e4c11dc5739b4ddc2a4dcd76b36e54c14e39ad93bd1c482459d33bce9c2f25" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595814 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qh9lk" event={"ID":"753eb7ac-080d-4a10-9bba-e3ed44d80985","Type":"ContainerDied","Data":"af7e6c2b1feb635cae91f168d18fc988de3aea199e3db2ed3d9998cfc09c0942"} Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.595824 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7e6c2b1feb635cae91f168d18fc988de3aea199e3db2ed3d9998cfc09c0942" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.599260 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.639117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5q4x\" (UniqueName: \"kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x\") pod \"00d4e762-f4d9-4409-9051-d396157c0e90\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.639899 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zsp5\" (UniqueName: \"kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5\") pod \"d0683710-3d66-4128-981a-227590aa97a0\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.640008 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts\") pod \"00d4e762-f4d9-4409-9051-d396157c0e90\" (UID: \"00d4e762-f4d9-4409-9051-d396157c0e90\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.640072 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts\") pod \"d0683710-3d66-4128-981a-227590aa97a0\" (UID: \"d0683710-3d66-4128-981a-227590aa97a0\") " Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641007 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753eb7ac-080d-4a10-9bba-e3ed44d80985-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641043 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2319532e-1ba5-4616-8363-16f109438bd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641057 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mpj\" (UniqueName: \"kubernetes.io/projected/2319532e-1ba5-4616-8363-16f109438bd8-kube-api-access-v9mpj\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641075 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b254d08-6ebd-492e-b955-afc9bed1b627-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641091 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nckp\" (UniqueName: \"kubernetes.io/projected/1b254d08-6ebd-492e-b955-afc9bed1b627-kube-api-access-8nckp\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641103 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szcgh\" (UniqueName: \"kubernetes.io/projected/753eb7ac-080d-4a10-9bba-e3ed44d80985-kube-api-access-szcgh\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641115 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpxz\" (UniqueName: \"kubernetes.io/projected/12417d7c-b1ee-406f-b60c-415562c15782-kube-api-access-shpxz\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.641127 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12417d7c-b1ee-406f-b60c-415562c15782-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.642012 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00d4e762-f4d9-4409-9051-d396157c0e90" (UID: "00d4e762-f4d9-4409-9051-d396157c0e90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.642461 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0683710-3d66-4128-981a-227590aa97a0" (UID: "d0683710-3d66-4128-981a-227590aa97a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.649889 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x" (OuterVolumeSpecName: "kube-api-access-m5q4x") pod "00d4e762-f4d9-4409-9051-d396157c0e90" (UID: "00d4e762-f4d9-4409-9051-d396157c0e90"). InnerVolumeSpecName "kube-api-access-m5q4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.650001 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5" (OuterVolumeSpecName: "kube-api-access-9zsp5") pod "d0683710-3d66-4128-981a-227590aa97a0" (UID: "d0683710-3d66-4128-981a-227590aa97a0"). InnerVolumeSpecName "kube-api-access-9zsp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.741751 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zsp5\" (UniqueName: \"kubernetes.io/projected/d0683710-3d66-4128-981a-227590aa97a0-kube-api-access-9zsp5\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.741794 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00d4e762-f4d9-4409-9051-d396157c0e90-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.741807 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0683710-3d66-4128-981a-227590aa97a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.741824 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5q4x\" (UniqueName: \"kubernetes.io/projected/00d4e762-f4d9-4409-9051-d396157c0e90-kube-api-access-m5q4x\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:25 crc kubenswrapper[4745]: I1209 11:53:25.974848 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.065822 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.066095 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="dnsmasq-dns" containerID="cri-o://c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8" gracePeriod=10 Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.528159 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.572291 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb\") pod \"12a5c3fa-478a-4716-b597-6b6b54eb274a\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.572450 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config\") pod \"12a5c3fa-478a-4716-b597-6b6b54eb274a\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.572546 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb\") pod \"12a5c3fa-478a-4716-b597-6b6b54eb274a\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.572611 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mzjp\" (UniqueName: \"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp\") pod \"12a5c3fa-478a-4716-b597-6b6b54eb274a\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.572655 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc\") pod \"12a5c3fa-478a-4716-b597-6b6b54eb274a\" (UID: \"12a5c3fa-478a-4716-b597-6b6b54eb274a\") " Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.646994 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp" (OuterVolumeSpecName: "kube-api-access-7mzjp") pod "12a5c3fa-478a-4716-b597-6b6b54eb274a" (UID: "12a5c3fa-478a-4716-b597-6b6b54eb274a"). InnerVolumeSpecName "kube-api-access-7mzjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.673223 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9tff" event={"ID":"29c95beb-2818-4a46-813c-b53fedce2a59","Type":"ContainerStarted","Data":"4aace1f8289737b036612bc10e88c7ddf6bde91fa3445accdbdcc74188e82c74"} Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.674328 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mzjp\" (UniqueName: \"kubernetes.io/projected/12a5c3fa-478a-4716-b597-6b6b54eb274a-kube-api-access-7mzjp\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.678352 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12a5c3fa-478a-4716-b597-6b6b54eb274a" (UID: "12a5c3fa-478a-4716-b597-6b6b54eb274a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.696775 4745 generic.go:334] "Generic (PLEG): container finished" podID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerID="c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8" exitCode=0 Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.697090 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.697489 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" event={"ID":"12a5c3fa-478a-4716-b597-6b6b54eb274a","Type":"ContainerDied","Data":"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8"} Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.697544 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-dn8js" event={"ID":"12a5c3fa-478a-4716-b597-6b6b54eb274a","Type":"ContainerDied","Data":"8f61721a5bd89a50ba16b8000257f20fb6efe1c08c924d446f41ff935e6a1844"} Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.697568 4745 scope.go:117] "RemoveContainer" containerID="c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.697716 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xl2mt" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.718007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12a5c3fa-478a-4716-b597-6b6b54eb274a" (UID: "12a5c3fa-478a-4716-b597-6b6b54eb274a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.749081 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12a5c3fa-478a-4716-b597-6b6b54eb274a" (UID: "12a5c3fa-478a-4716-b597-6b6b54eb274a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.770413 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config" (OuterVolumeSpecName: "config") pod "12a5c3fa-478a-4716-b597-6b6b54eb274a" (UID: "12a5c3fa-478a-4716-b597-6b6b54eb274a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.775126 4745 scope.go:117] "RemoveContainer" containerID="5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.776775 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.776796 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.776808 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.776817 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12a5c3fa-478a-4716-b597-6b6b54eb274a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.782472 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p9tff" podStartSLOduration=3.229080002 podStartE2EDuration="7.782447509s" podCreationTimestamp="2025-12-09 11:53:19 +0000 UTC" 
firstStartedPulling="2025-12-09 11:53:20.769668536 +0000 UTC m=+1287.594870060" lastFinishedPulling="2025-12-09 11:53:25.323036043 +0000 UTC m=+1292.148237567" observedRunningTime="2025-12-09 11:53:26.742962696 +0000 UTC m=+1293.568164220" watchObservedRunningTime="2025-12-09 11:53:26.782447509 +0000 UTC m=+1293.607649033" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.885665 4745 scope.go:117] "RemoveContainer" containerID="c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8" Dec 09 11:53:26 crc kubenswrapper[4745]: E1209 11:53:26.886309 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8\": container with ID starting with c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8 not found: ID does not exist" containerID="c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.886372 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8"} err="failed to get container status \"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8\": rpc error: code = NotFound desc = could not find container \"c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8\": container with ID starting with c328f257fb9e615113feddb465766fca761be4e1ddccefe990271d40397dbcb8 not found: ID does not exist" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.886413 4745 scope.go:117] "RemoveContainer" containerID="5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54" Dec 09 11:53:26 crc kubenswrapper[4745]: E1209 11:53:26.887951 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54\": 
container with ID starting with 5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54 not found: ID does not exist" containerID="5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54" Dec 09 11:53:26 crc kubenswrapper[4745]: I1209 11:53:26.888068 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54"} err="failed to get container status \"5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54\": rpc error: code = NotFound desc = could not find container \"5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54\": container with ID starting with 5e819e4010b15dee9bf4896fb4269aaf967028ab3662e37643bf573ee3d16a54 not found: ID does not exist" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.038333 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.045491 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-dn8js"] Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.220293 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pw7nc" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.287148 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kj79\" (UniqueName: \"kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79\") pod \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.287221 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data\") pod \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.287338 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data\") pod \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.287558 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle\") pod \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\" (UID: \"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a\") " Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.296612 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" (UID: "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.297732 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79" (OuterVolumeSpecName: "kube-api-access-9kj79") pod "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" (UID: "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a"). InnerVolumeSpecName "kube-api-access-9kj79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.330808 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" (UID: "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.346681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data" (OuterVolumeSpecName: "config-data") pod "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" (UID: "bf657ee5-9433-4e1a-9a66-33f59c0e5b0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.390286 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kj79\" (UniqueName: \"kubernetes.io/projected/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-kube-api-access-9kj79\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.390334 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.390345 4745 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.390354 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.566637 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" path="/var/lib/kubelet/pods/12a5c3fa-478a-4716-b597-6b6b54eb274a/volumes" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.710421 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pw7nc" event={"ID":"bf657ee5-9433-4e1a-9a66-33f59c0e5b0a","Type":"ContainerDied","Data":"522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac"} Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.710479 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="522405e50301614f309ce0bac48f6418a6ac2927d286c166b70c8a7d455cd0ac" Dec 09 11:53:27 crc kubenswrapper[4745]: I1209 11:53:27.710481 4745 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pw7nc" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074125 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074637 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12417d7c-b1ee-406f-b60c-415562c15782" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074657 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="12417d7c-b1ee-406f-b60c-415562c15782" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074682 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b254d08-6ebd-492e-b955-afc9bed1b627" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074692 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b254d08-6ebd-492e-b955-afc9bed1b627" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074702 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d4e762-f4d9-4409-9051-d396157c0e90" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074709 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d4e762-f4d9-4409-9051-d396157c0e90" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074720 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753eb7ac-080d-4a10-9bba-e3ed44d80985" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074726 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="753eb7ac-080d-4a10-9bba-e3ed44d80985" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074739 
4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2319532e-1ba5-4616-8363-16f109438bd8" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074749 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2319532e-1ba5-4616-8363-16f109438bd8" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074761 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0683710-3d66-4128-981a-227590aa97a0" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074767 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0683710-3d66-4128-981a-227590aa97a0" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074779 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" containerName="glance-db-sync" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074785 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" containerName="glance-db-sync" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074802 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="init" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074808 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="init" Dec 09 11:53:28 crc kubenswrapper[4745]: E1209 11:53:28.074817 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="dnsmasq-dns" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.074823 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="dnsmasq-dns" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075006 
4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b254d08-6ebd-492e-b955-afc9bed1b627" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075022 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a5c3fa-478a-4716-b597-6b6b54eb274a" containerName="dnsmasq-dns" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075030 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" containerName="glance-db-sync" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075046 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0683710-3d66-4128-981a-227590aa97a0" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075056 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="753eb7ac-080d-4a10-9bba-e3ed44d80985" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075071 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d4e762-f4d9-4409-9051-d396157c0e90" containerName="mariadb-database-create" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075080 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="12417d7c-b1ee-406f-b60c-415562c15782" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.075089 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2319532e-1ba5-4616-8363-16f109438bd8" containerName="mariadb-account-create-update" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.077194 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107168 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107412 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107436 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" 
(UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.107486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qs56\" (UniqueName: \"kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.109697 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.248966 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qs56\" (UniqueName: \"kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.249055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.249108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.249225 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.249260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.249278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.251047 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.251112 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.251827 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.254116 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.254139 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.270757 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qs56\" (UniqueName: \"kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56\") pod \"dnsmasq-dns-6dbc684849-rh6zl\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:28 crc kubenswrapper[4745]: I1209 11:53:28.425749 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:29 crc kubenswrapper[4745]: I1209 11:53:29.082923 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:29 crc kubenswrapper[4745]: I1209 11:53:29.753085 4745 generic.go:334] "Generic (PLEG): container finished" podID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerID="a5dca1900d951ebe9b682fb2cb2c4e9b00a2246df30edfba6bfc7058fb8c77d7" exitCode=0 Dec 09 11:53:29 crc kubenswrapper[4745]: I1209 11:53:29.753282 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" event={"ID":"d4142b9d-e631-4eda-8c14-dda2ca6c59bd","Type":"ContainerDied","Data":"a5dca1900d951ebe9b682fb2cb2c4e9b00a2246df30edfba6bfc7058fb8c77d7"} Dec 09 11:53:29 crc kubenswrapper[4745]: I1209 11:53:29.753433 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" event={"ID":"d4142b9d-e631-4eda-8c14-dda2ca6c59bd","Type":"ContainerStarted","Data":"9f01241ec9826f6407a46e22f8eb5467c9fee09b8a3369276ffed529939a964a"} Dec 09 11:53:30 crc kubenswrapper[4745]: I1209 11:53:30.765415 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" event={"ID":"d4142b9d-e631-4eda-8c14-dda2ca6c59bd","Type":"ContainerStarted","Data":"f5d78e571eff9679a138beaedc658c3b16d4a65ed2a33e6ee698e05f4b907049"} Dec 09 11:53:30 crc kubenswrapper[4745]: I1209 11:53:30.766048 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:30 crc kubenswrapper[4745]: I1209 11:53:30.768197 4745 generic.go:334] "Generic (PLEG): container finished" podID="29c95beb-2818-4a46-813c-b53fedce2a59" containerID="4aace1f8289737b036612bc10e88c7ddf6bde91fa3445accdbdcc74188e82c74" exitCode=0 Dec 09 11:53:30 crc kubenswrapper[4745]: I1209 11:53:30.768244 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-p9tff" event={"ID":"29c95beb-2818-4a46-813c-b53fedce2a59","Type":"ContainerDied","Data":"4aace1f8289737b036612bc10e88c7ddf6bde91fa3445accdbdcc74188e82c74"} Dec 09 11:53:30 crc kubenswrapper[4745]: I1209 11:53:30.799044 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" podStartSLOduration=2.799021212 podStartE2EDuration="2.799021212s" podCreationTimestamp="2025-12-09 11:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:30.78446258 +0000 UTC m=+1297.609664114" watchObservedRunningTime="2025-12-09 11:53:30.799021212 +0000 UTC m=+1297.624222736" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.117618 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.286663 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data\") pod \"29c95beb-2818-4a46-813c-b53fedce2a59\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.287103 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tnp\" (UniqueName: \"kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp\") pod \"29c95beb-2818-4a46-813c-b53fedce2a59\" (UID: \"29c95beb-2818-4a46-813c-b53fedce2a59\") " Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.287201 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle\") pod \"29c95beb-2818-4a46-813c-b53fedce2a59\" (UID: 
\"29c95beb-2818-4a46-813c-b53fedce2a59\") " Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.292284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp" (OuterVolumeSpecName: "kube-api-access-s7tnp") pod "29c95beb-2818-4a46-813c-b53fedce2a59" (UID: "29c95beb-2818-4a46-813c-b53fedce2a59"). InnerVolumeSpecName "kube-api-access-s7tnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.323883 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29c95beb-2818-4a46-813c-b53fedce2a59" (UID: "29c95beb-2818-4a46-813c-b53fedce2a59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.351346 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data" (OuterVolumeSpecName: "config-data") pod "29c95beb-2818-4a46-813c-b53fedce2a59" (UID: "29c95beb-2818-4a46-813c-b53fedce2a59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.388913 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.388947 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tnp\" (UniqueName: \"kubernetes.io/projected/29c95beb-2818-4a46-813c-b53fedce2a59-kube-api-access-s7tnp\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.388958 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c95beb-2818-4a46-813c-b53fedce2a59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.795539 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9tff" event={"ID":"29c95beb-2818-4a46-813c-b53fedce2a59","Type":"ContainerDied","Data":"8f13d0f13b9ccb652de7ae162e32ffa14aa9bd0d58f2b8d119a63bcc688420d7"} Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.796450 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f13d0f13b9ccb652de7ae162e32ffa14aa9bd0d58f2b8d119a63bcc688420d7" Dec 09 11:53:32 crc kubenswrapper[4745]: I1209 11:53:32.795967 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p9tff" Dec 09 11:53:32 crc kubenswrapper[4745]: E1209 11:53:32.975458 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.190475 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.191700 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="dnsmasq-dns" containerID="cri-o://f5d78e571eff9679a138beaedc658c3b16d4a65ed2a33e6ee698e05f4b907049" gracePeriod=10 Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.236641 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:33 crc kubenswrapper[4745]: E1209 11:53:33.237044 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c95beb-2818-4a46-813c-b53fedce2a59" containerName="keystone-db-sync" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.237056 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c95beb-2818-4a46-813c-b53fedce2a59" containerName="keystone-db-sync" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.237248 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c95beb-2818-4a46-813c-b53fedce2a59" containerName="keystone-db-sync" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.238385 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.259245 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7dpkp"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.260754 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.272102 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.272297 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fh2d2" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.272562 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.272690 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.272794 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.281480 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.301748 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7dpkp"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341213 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 
11:53:33.341460 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341557 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341659 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: 
I1209 11:53:33.341886 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.341957 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.342035 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hhnh\" (UniqueName: \"kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.342152 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2hk\" (UniqueName: \"kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.342228 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 
11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.342302 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443746 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443778 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443821 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443865 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443897 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443942 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443962 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hhnh\" (UniqueName: \"kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.443994 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vt2hk\" (UniqueName: \"kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.444012 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.444033 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.445833 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.451758 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.451773 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys\") pod 
\"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.454993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.461200 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.461257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.461556 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.465428 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc 
kubenswrapper[4745]: I1209 11:53:33.474216 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.475496 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hhnh\" (UniqueName: \"kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh\") pod \"keystone-bootstrap-7dpkp\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.477693 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.493950 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.497921 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.501623 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.502955 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.529617 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2hk\" (UniqueName: \"kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk\") pod \"dnsmasq-dns-6974cb66c7-c6kll\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.532420 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.548819 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ln76\" (UniqueName: \"kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.548886 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.548932 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data\") pod \"ceilometer-0\" (UID: 
\"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.548961 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.548981 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.549000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.549021 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.553685 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mj74p"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.558275 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.563429 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.592596 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ql62n" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.595791 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.601100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.617424 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.695017 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.695408 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.695457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc 
kubenswrapper[4745]: I1209 11:53:33.695523 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.695741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ln76\" (UniqueName: \"kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.695889 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.697989 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.699902 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.712376 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd\") pod \"ceilometer-0\" 
(UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.713423 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.717869 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.728063 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.731092 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.751824 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mj74p"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.767789 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jmljq"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.769929 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.784102 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jmljq"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.791379 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.791812 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4n696" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.793878 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ln76\" (UniqueName: \"kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76\") pod \"ceilometer-0\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") " pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.797565 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9zn5l"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.799142 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.802956 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.803031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.803053 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz425\" (UniqueName: \"kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.803132 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.803156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle\") pod \"cinder-db-sync-mj74p\" (UID: 
\"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.803282 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.809041 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.809249 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.809342 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.818686 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9zn5l"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.819713 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hqlhg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.881347 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.883353 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.902050 4745 generic.go:334] "Generic (PLEG): container finished" podID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerID="f5d78e571eff9679a138beaedc658c3b16d4a65ed2a33e6ee698e05f4b907049" exitCode=0 Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.902096 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" event={"ID":"d4142b9d-e631-4eda-8c14-dda2ca6c59bd","Type":"ContainerDied","Data":"f5d78e571eff9679a138beaedc658c3b16d4a65ed2a33e6ee698e05f4b907049"} Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.932533 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.933689 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.944632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.944715 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: 
I1209 11:53:33.944773 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949177 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kctn7\" (UniqueName: \"kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949258 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949297 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949329 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 
11:53:33.949386 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949435 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr6c7\" (UniqueName: \"kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949495 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949576 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz425\" (UniqueName: \"kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949697 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949740 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949790 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.949832 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st28d\" (UniqueName: \"kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.953801 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.955642 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-n7q5f"] Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.956938 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.968432 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.972599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.976716 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:33 crc kubenswrapper[4745]: I1209 11:53:33.981255 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle\") pod \"cinder-db-sync-mj74p\" (UID: 
\"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.011852 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zvjnk" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.012175 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.012334 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.025551 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz425\" (UniqueName: \"kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425\") pod \"cinder-db-sync-mj74p\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.033817 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.067321 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.067394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.067437 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kctn7\" (UniqueName: \"kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.067464 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.067498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.068900 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069504 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069846 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr6c7\" (UniqueName: \"kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069917 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069943 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.069967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070003 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070076 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070099 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st28d\" (UniqueName: \"kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070144 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf5tp\" (UniqueName: \"kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070169 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.070844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.080087 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.082479 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n7q5f"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.082565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.084946 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.093851 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kctn7\" (UniqueName: \"kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7\") pod \"dnsmasq-dns-66567888d7-bl7rg\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.101393 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.102000 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.103219 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr6c7\" (UniqueName: \"kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7\") pod \"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.135164 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.155333 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st28d\" (UniqueName: 
\"kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d\") pod \"neutron-db-sync-9zn5l\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.171745 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.171794 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.171869 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf5tp\" (UniqueName: \"kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.171890 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.171944 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts\") pod \"placement-db-sync-n7q5f\" (UID: 
\"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.173099 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.179271 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.183573 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.183723 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.184200 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.188310 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle\") pod 
\"barbican-db-sync-jmljq\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.236870 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.237280 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.237956 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mj74p" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.247689 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf5tp\" (UniqueName: \"kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp\") pod \"placement-db-sync-n7q5f\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.274413 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb\") pod \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.274972 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config\") pod \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.275064 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb\") pod 
\"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.275107 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0\") pod \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.275209 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc\") pod \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.275293 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qs56\" (UniqueName: \"kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56\") pod \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\" (UID: \"d4142b9d-e631-4eda-8c14-dda2ca6c59bd\") " Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.286283 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56" (OuterVolumeSpecName: "kube-api-access-8qs56") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "kube-api-access-8qs56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.287575 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n7q5f" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.377280 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qs56\" (UniqueName: \"kubernetes.io/projected/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-kube-api-access-8qs56\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.400481 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.434326 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.445086 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: E1209 11:53:34.446375 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="dnsmasq-dns" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.446402 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="dnsmasq-dns" Dec 09 11:53:34 crc kubenswrapper[4745]: E1209 11:53:34.446426 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="init" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.446437 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="init" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.446686 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" containerName="dnsmasq-dns" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.448559 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.451845 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.461331 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jmljq" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.467708 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.476011 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.478708 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.478750 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.478765 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.497758 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.498244 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.499696 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vchrd" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.502129 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.572911 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config" (OuterVolumeSpecName: "config") pod "d4142b9d-e631-4eda-8c14-dda2ca6c59bd" (UID: "d4142b9d-e631-4eda-8c14-dda2ca6c59bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580314 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580471 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580499 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580549 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580616 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxhz\" (UniqueName: \"kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580711 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580813 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.580832 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4142b9d-e631-4eda-8c14-dda2ca6c59bd-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.653940 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.655534 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.666097 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.666365 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.671675 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.682871 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.683092 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.683179 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.683331 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxhz\" (UniqueName: \"kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz\") pod 
\"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.683479 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.696787 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.697429 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.697906 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.685863 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") device 
mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.685530 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.702328 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.707352 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.714982 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.715037 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.715370 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxhz\" (UniqueName: 
\"kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.716304 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.728184 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.782838 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.830205 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.833182 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.833245 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.833323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.833385 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.833463 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.834367 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.834438 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtdl\" (UniqueName: \"kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: W1209 11:53:34.860774 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacbf1f60_7a6b_473b_8f36_00bc8bfffa3f.slice/crio-c08a53bfb57599cc3decea29fe159b55c3b631086570e4eb8258ef5955ebfbea WatchSource:0}: Error finding container c08a53bfb57599cc3decea29fe159b55c3b631086570e4eb8258ef5955ebfbea: Status 404 returned error can't find the container with id c08a53bfb57599cc3decea29fe159b55c3b631086570e4eb8258ef5955ebfbea Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.872118 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7dpkp"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.893291 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.944079 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" event={"ID":"25175de5-83bb-442d-b1ff-92f119310608","Type":"ContainerStarted","Data":"1206a88b664f99122e33f4e26596d5e898c4eefeac48a11fb97bf3081bfc2fb4"} Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.954614 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7dpkp" 
event={"ID":"c64c8ae5-485f-49b0-b232-fb97516a2ad2","Type":"ContainerStarted","Data":"b80cfece781f78e0d78d4f03d36fd54d83b57b94bd04c55991a2d4ef3a9535ef"} Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958014 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958069 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtdl\" (UniqueName: \"kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958126 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958176 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958202 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958228 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958262 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.958919 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.959343 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" 
Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.959397 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.964071 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.978107 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.978345 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbc684849-rh6zl" event={"ID":"d4142b9d-e631-4eda-8c14-dda2ca6c59bd","Type":"ContainerDied","Data":"9f01241ec9826f6407a46e22f8eb5467c9fee09b8a3369276ffed529939a964a"} Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.978496 4745 scope.go:117] "RemoveContainer" containerID="f5d78e571eff9679a138beaedc658c3b16d4a65ed2a33e6ee698e05f4b907049" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.979603 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.980716 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.997587 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:34 crc kubenswrapper[4745]: I1209 11:53:34.999347 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerStarted","Data":"c08a53bfb57599cc3decea29fe159b55c3b631086570e4eb8258ef5955ebfbea"} Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.001041 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.002329 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtdl\" (UniqueName: \"kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.015238 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9zn5l"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.036784 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.040445 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.056561 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dbc684849-rh6zl"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.103332 4745 scope.go:117] "RemoveContainer" containerID="a5dca1900d951ebe9b682fb2cb2c4e9b00a2246df30edfba6bfc7058fb8c77d7" Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.124667 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.229940 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jmljq"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.246892 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n7q5f"] Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.273119 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mj74p"] Dec 09 11:53:35 crc kubenswrapper[4745]: W1209 11:53:35.290007 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa1920d0_1725_4be7_baa4_e6561fcce10c.slice/crio-26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb WatchSource:0}: Error finding container 26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb: Status 404 returned error can't find the container with id 26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.323152 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.578022 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4142b9d-e631-4eda-8c14-dda2ca6c59bd" path="/var/lib/kubelet/pods/d4142b9d-e631-4eda-8c14-dda2ca6c59bd/volumes" Dec 09 11:53:35 crc kubenswrapper[4745]: W1209 11:53:35.867344 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f9c21f_ff7c_4f5c_a8e8_9e6ba5f824f0.slice/crio-c30108a69c6d5e4ffbb821b33631e751ae1885bdee0900ac46483f8ab382b90f WatchSource:0}: Error finding container c30108a69c6d5e4ffbb821b33631e751ae1885bdee0900ac46483f8ab382b90f: Status 404 returned error can't find the container with id c30108a69c6d5e4ffbb821b33631e751ae1885bdee0900ac46483f8ab382b90f Dec 09 11:53:35 crc kubenswrapper[4745]: I1209 11:53:35.885800 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.040824 4745 generic.go:334] "Generic (PLEG): container finished" podID="25175de5-83bb-442d-b1ff-92f119310608" containerID="19b7cf9325e16d0a7cdcb44795b0eb76e82e59659ee8d3f7058eb2ceda043974" exitCode=0 Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.041078 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" event={"ID":"25175de5-83bb-442d-b1ff-92f119310608","Type":"ContainerDied","Data":"19b7cf9325e16d0a7cdcb44795b0eb76e82e59659ee8d3f7058eb2ceda043974"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.043804 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7dpkp" event={"ID":"c64c8ae5-485f-49b0-b232-fb97516a2ad2","Type":"ContainerStarted","Data":"220c02beca474e41673d6888c903ca349368d7873a469bd6de0ce7dd666e8614"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.050631 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mj74p" event={"ID":"aa1920d0-1725-4be7-baa4-e6561fcce10c","Type":"ContainerStarted","Data":"26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.059918 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jmljq" event={"ID":"72a305c7-8afb-4b56-90b1-e071980fbcdd","Type":"ContainerStarted","Data":"a26e3c93faab4711cf03b9f8270bd7cbb96a37036761f7a143df979d059f23d8"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.067869 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.070565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9zn5l" event={"ID":"dcdefb66-0234-4aa7-97e4-bba6107a3e7d","Type":"ContainerStarted","Data":"3e03148e4090f73a824b767293e49cf1ab1feefbb029d6cb1a8a2eacdd6d58e1"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.070607 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9zn5l" event={"ID":"dcdefb66-0234-4aa7-97e4-bba6107a3e7d","Type":"ContainerStarted","Data":"542636681e78021afa3052f516acccb13cafa50aa8b6f2986540aff2cf061da6"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.087368 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n7q5f" event={"ID":"8b162675-ea0a-4433-a94a-1f5bd6c81e01","Type":"ContainerStarted","Data":"81a89f289f0d5f11aef8ce5acc3a8d75da0fb7dac953050c2db1686a4255c8b2"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.094435 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7dpkp" podStartSLOduration=3.09441472 podStartE2EDuration="3.09441472s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:36.08775517 +0000 UTC m=+1302.912956694" watchObservedRunningTime="2025-12-09 11:53:36.09441472 +0000 UTC m=+1302.919616244" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.119168 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9zn5l" podStartSLOduration=3.119151296 podStartE2EDuration="3.119151296s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:36.117940873 +0000 UTC m=+1302.943142407" watchObservedRunningTime="2025-12-09 11:53:36.119151296 +0000 UTC m=+1302.944352820" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.123238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerStarted","Data":"c30108a69c6d5e4ffbb821b33631e751ae1885bdee0900ac46483f8ab382b90f"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.133950 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerID="9b695dd7f2628ab4074c4b49199ee571b429219d71ac7aa99b2e90bee6a995ea" exitCode=0 Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.134004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" event={"ID":"cb3343f8-ef08-4d66-a004-cf5b1044dded","Type":"ContainerDied","Data":"9b695dd7f2628ab4074c4b49199ee571b429219d71ac7aa99b2e90bee6a995ea"} Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.134033 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" event={"ID":"cb3343f8-ef08-4d66-a004-cf5b1044dded","Type":"ContainerStarted","Data":"f14e3180bacdba17dc91c7878f52c51c5059c2ac3f72840bd7bed8c90da09e13"} Dec 09 11:53:36 crc 
kubenswrapper[4745]: I1209 11:53:36.599408 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.676525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.677113 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.677175 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.677236 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.677455 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt2hk\" (UniqueName: \"kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.677487 
4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb\") pod \"25175de5-83bb-442d-b1ff-92f119310608\" (UID: \"25175de5-83bb-442d-b1ff-92f119310608\") " Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.685841 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk" (OuterVolumeSpecName: "kube-api-access-vt2hk") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "kube-api-access-vt2hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.715370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.724608 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.730467 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config" (OuterVolumeSpecName: "config") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.735244 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.735478 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25175de5-83bb-442d-b1ff-92f119310608" (UID: "25175de5-83bb-442d-b1ff-92f119310608"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781181 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt2hk\" (UniqueName: \"kubernetes.io/projected/25175de5-83bb-442d-b1ff-92f119310608-kube-api-access-vt2hk\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781253 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781275 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781286 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-config\") on node \"crc\" DevicePath \"\"" Dec 
09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781296 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:36 crc kubenswrapper[4745]: I1209 11:53:36.781306 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25175de5-83bb-442d-b1ff-92f119310608-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.021571 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.184361 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" event={"ID":"25175de5-83bb-442d-b1ff-92f119310608","Type":"ContainerDied","Data":"1206a88b664f99122e33f4e26596d5e898c4eefeac48a11fb97bf3081bfc2fb4"} Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.184433 4745 scope.go:117] "RemoveContainer" containerID="19b7cf9325e16d0a7cdcb44795b0eb76e82e59659ee8d3f7058eb2ceda043974" Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.184618 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6974cb66c7-c6kll" Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.227663 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.235317 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerStarted","Data":"45c86c27500b85769f0bc0d2faeecb3c6e718d0fd9f51629f57846e4ebf0ac1d"} Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.242846 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.486802 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerStarted","Data":"88cedf9f96e1985c3236b7e018f24e86e7a957d8e27b94da5f1c739471af5f63"} Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.527060 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" event={"ID":"cb3343f8-ef08-4d66-a004-cf5b1044dded","Type":"ContainerStarted","Data":"8869a49aa69d605b4e05eb848339e645af9ddb24aed8f91b34855b9f996f246c"} Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.527511 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.597115 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" podStartSLOduration=4.597081341 podStartE2EDuration="4.597081341s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:37.562484519 +0000 UTC m=+1304.387686043" 
watchObservedRunningTime="2025-12-09 11:53:37.597081341 +0000 UTC m=+1304.422282865" Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.613375 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:37 crc kubenswrapper[4745]: I1209 11:53:37.613436 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6974cb66c7-c6kll"] Dec 09 11:53:38 crc kubenswrapper[4745]: I1209 11:53:38.545721 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerStarted","Data":"fb367427c862f0d2d064e6570c5b3e1c0719b4b61822c83482725558c0ff602d"} Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.561004 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-log" containerID="cri-o://fb367427c862f0d2d064e6570c5b3e1c0719b4b61822c83482725558c0ff602d" gracePeriod=30 Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.561137 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-httpd" containerID="cri-o://3cf81610e33436dc6d2376c12aae16221defc58f627dfd67889139deec397123" gracePeriod=30 Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.568325 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-log" containerID="cri-o://88cedf9f96e1985c3236b7e018f24e86e7a957d8e27b94da5f1c739471af5f63" gracePeriod=30 Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.568403 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-httpd" containerID="cri-o://e451589f11f069ecce6b8c32faed56ed4e792c4e545fb97c88f9ce7e41095a15" gracePeriod=30 Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.573244 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25175de5-83bb-442d-b1ff-92f119310608" path="/var/lib/kubelet/pods/25175de5-83bb-442d-b1ff-92f119310608/volumes" Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.574505 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerStarted","Data":"3cf81610e33436dc6d2376c12aae16221defc58f627dfd67889139deec397123"} Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.574696 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerStarted","Data":"e451589f11f069ecce6b8c32faed56ed4e792c4e545fb97c88f9ce7e41095a15"} Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.597065 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.5970438940000005 podStartE2EDuration="6.597043894s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:39.591229247 +0000 UTC m=+1306.416430771" watchObservedRunningTime="2025-12-09 11:53:39.597043894 +0000 UTC m=+1306.422245418" Dec 09 11:53:39 crc kubenswrapper[4745]: I1209 11:53:39.623761 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.623732332 podStartE2EDuration="6.623732332s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:53:39.619477378 +0000 UTC m=+1306.444678902" watchObservedRunningTime="2025-12-09 11:53:39.623732332 +0000 UTC m=+1306.448933856" Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.583263 4745 generic.go:334] "Generic (PLEG): container finished" podID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerID="3cf81610e33436dc6d2376c12aae16221defc58f627dfd67889139deec397123" exitCode=143 Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.583560 4745 generic.go:334] "Generic (PLEG): container finished" podID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerID="fb367427c862f0d2d064e6570c5b3e1c0719b4b61822c83482725558c0ff602d" exitCode=143 Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.583415 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerDied","Data":"3cf81610e33436dc6d2376c12aae16221defc58f627dfd67889139deec397123"} Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.583639 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerDied","Data":"fb367427c862f0d2d064e6570c5b3e1c0719b4b61822c83482725558c0ff602d"} Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.586286 4745 generic.go:334] "Generic (PLEG): container finished" podID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerID="e451589f11f069ecce6b8c32faed56ed4e792c4e545fb97c88f9ce7e41095a15" exitCode=143 Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.586318 4745 generic.go:334] "Generic (PLEG): container finished" podID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerID="88cedf9f96e1985c3236b7e018f24e86e7a957d8e27b94da5f1c739471af5f63" exitCode=143 Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.586318 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerDied","Data":"e451589f11f069ecce6b8c32faed56ed4e792c4e545fb97c88f9ce7e41095a15"} Dec 09 11:53:40 crc kubenswrapper[4745]: I1209 11:53:40.586381 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerDied","Data":"88cedf9f96e1985c3236b7e018f24e86e7a957d8e27b94da5f1c739471af5f63"} Dec 09 11:53:43 crc kubenswrapper[4745]: E1209 11:53:43.278875 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:43 crc kubenswrapper[4745]: I1209 11:53:43.639816 4745 generic.go:334] "Generic (PLEG): container finished" podID="c64c8ae5-485f-49b0-b232-fb97516a2ad2" containerID="220c02beca474e41673d6888c903ca349368d7873a469bd6de0ce7dd666e8614" exitCode=0 Dec 09 11:53:43 crc kubenswrapper[4745]: I1209 11:53:43.640579 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7dpkp" event={"ID":"c64c8ae5-485f-49b0-b232-fb97516a2ad2","Type":"ContainerDied","Data":"220c02beca474e41673d6888c903ca349368d7873a469bd6de0ce7dd666e8614"} Dec 09 11:53:44 crc kubenswrapper[4745]: I1209 11:53:44.239710 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:53:44 crc kubenswrapper[4745]: I1209 11:53:44.301093 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:53:44 crc kubenswrapper[4745]: I1209 11:53:44.301331 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" containerID="cri-o://a3201869945e2890e32a0e743f4f04aa015cf684636a1287e3cdf9f3358fc160" gracePeriod=10 Dec 09 11:53:44 crc kubenswrapper[4745]: I1209 11:53:44.658322 4745 generic.go:334] "Generic (PLEG): container finished" podID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerID="a3201869945e2890e32a0e743f4f04aa015cf684636a1287e3cdf9f3358fc160" exitCode=0 Dec 09 11:53:44 crc kubenswrapper[4745]: I1209 11:53:44.658533 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" event={"ID":"484908d8-e648-4fa4-954b-490f2f06ebb6","Type":"ContainerDied","Data":"a3201869945e2890e32a0e743f4f04aa015cf684636a1287e3cdf9f3358fc160"} Dec 09 11:53:45 crc kubenswrapper[4745]: I1209 11:53:45.972304 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.197160 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.218971 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.241172 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.345713 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.345772 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.345824 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxhz\" (UniqueName: \"kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.345878 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hhnh\" (UniqueName: \"kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.345989 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346022 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346049 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346133 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346182 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346218 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346243 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346309 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346345 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtdl\" (UniqueName: \"kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346373 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346402 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346425 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts\") pod \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\" (UID: \"c64c8ae5-485f-49b0-b232-fb97516a2ad2\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346454 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 
11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346481 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346541 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle\") pod \"c19bb44e-0486-4793-94cb-2b0cf646c219\" (UID: \"c19bb44e-0486-4793-94cb-2b0cf646c219\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346572 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346602 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346669 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts\") pod \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\" (UID: \"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0\") " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.346891 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.347347 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.347935 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs" (OuterVolumeSpecName: "logs") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.348341 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.354974 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts" (OuterVolumeSpecName: "scripts") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.355921 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts" (OuterVolumeSpecName: "scripts") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.355929 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl" (OuterVolumeSpecName: "kube-api-access-dgtdl") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "kube-api-access-dgtdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.356231 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz" (OuterVolumeSpecName: "kube-api-access-tpxhz") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "kube-api-access-tpxhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.361689 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs" (OuterVolumeSpecName: "logs") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.363871 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.369262 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.369798 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.369977 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts" (OuterVolumeSpecName: "scripts") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.403753 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh" (OuterVolumeSpecName: "kube-api-access-2hhnh") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "kube-api-access-2hhnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.404770 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.444712 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.448777 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.448883 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.448962 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19bb44e-0486-4793-94cb-2b0cf646c219-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449023 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449079 4745 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449136 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtdl\" (UniqueName: \"kubernetes.io/projected/c19bb44e-0486-4793-94cb-2b0cf646c219-kube-api-access-dgtdl\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449211 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449265 4745 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449323 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449376 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449429 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449478 4745 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449545 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxhz\" (UniqueName: \"kubernetes.io/projected/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-kube-api-access-tpxhz\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.449608 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hhnh\" (UniqueName: \"kubernetes.io/projected/c64c8ae5-485f-49b0-b232-fb97516a2ad2-kube-api-access-2hhnh\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.455770 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.489207 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.490556 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.498814 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.513122 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data" (OuterVolumeSpecName: "config-data") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.517668 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data" (OuterVolumeSpecName: "config-data") pod "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" (UID: "30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.533969 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.541572 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data" (OuterVolumeSpecName: "config-data") pod "c64c8ae5-485f-49b0-b232-fb97516a2ad2" (UID: "c64c8ae5-485f-49b0-b232-fb97516a2ad2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.542784 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c19bb44e-0486-4793-94cb-2b0cf646c219" (UID: "c19bb44e-0486-4793-94cb-2b0cf646c219"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552083 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552113 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552126 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64c8ae5-485f-49b0-b232-fb97516a2ad2-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552134 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552145 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552153 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552161 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552169 4745 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19bb44e-0486-4793-94cb-2b0cf646c219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.552177 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.702360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7dpkp" event={"ID":"c64c8ae5-485f-49b0-b232-fb97516a2ad2","Type":"ContainerDied","Data":"b80cfece781f78e0d78d4f03d36fd54d83b57b94bd04c55991a2d4ef3a9535ef"} Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.702442 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80cfece781f78e0d78d4f03d36fd54d83b57b94bd04c55991a2d4ef3a9535ef" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.702387 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7dpkp" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.705569 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19bb44e-0486-4793-94cb-2b0cf646c219","Type":"ContainerDied","Data":"45c86c27500b85769f0bc0d2faeecb3c6e718d0fd9f51629f57846e4ebf0ac1d"} Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.705641 4745 scope.go:117] "RemoveContainer" containerID="3cf81610e33436dc6d2376c12aae16221defc58f627dfd67889139deec397123" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.705842 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.710866 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0","Type":"ContainerDied","Data":"c30108a69c6d5e4ffbb821b33631e751ae1885bdee0900ac46483f8ab382b90f"} Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.711001 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.821471 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.841310 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.857890 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.869889 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888051 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888676 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888705 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888725 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25175de5-83bb-442d-b1ff-92f119310608" containerName="init" Dec 09 11:53:48 
crc kubenswrapper[4745]: I1209 11:53:48.888734 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="25175de5-83bb-442d-b1ff-92f119310608" containerName="init" Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888764 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888774 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888789 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888814 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888838 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888847 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: E1209 11:53:48.888858 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64c8ae5-485f-49b0-b232-fb97516a2ad2" containerName="keystone-bootstrap" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.888867 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64c8ae5-485f-49b0-b232-fb97516a2ad2" containerName="keystone-bootstrap" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.889088 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64c8ae5-485f-49b0-b232-fb97516a2ad2" containerName="keystone-bootstrap" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 
11:53:48.889107 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="25175de5-83bb-442d-b1ff-92f119310608" containerName="init" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.889129 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.889146 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.889158 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-httpd" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.889173 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" containerName="glance-log" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.890279 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.895452 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.895717 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.895850 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.896447 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vchrd" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.901621 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.904072 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.906727 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.907324 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.926589 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.936946 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972299 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972334 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 
11:53:48.972369 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlbk\" (UniqueName: \"kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972404 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972426 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972450 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972571 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc 
kubenswrapper[4745]: I1209 11:53:48.972594 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972613 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972634 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972674 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972701 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " 
pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972728 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972760 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:48 crc kubenswrapper[4745]: I1209 11:53:48.972783 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075143 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075208 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlbk\" (UniqueName: \"kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk\") pod \"glance-default-external-api-0\" (UID: 
\"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075254 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075426 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075468 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075501 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075545 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075573 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc 
kubenswrapper[4745]: I1209 11:53:49.075617 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.075645 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.076234 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.076259 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.076383 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.076468 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.076597 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.078861 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.079001 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.080048 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.083774 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.084107 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.084683 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.086051 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.091198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.091500 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.093260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.098066 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.103038 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlbk\" (UniqueName: \"kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.106527 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.111243 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 
11:53:49.118918 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.217353 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.230526 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.310581 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7dpkp"] Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.319673 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7dpkp"] Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.397736 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vxcd9"] Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.398900 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.401867 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.402086 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.406113 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.406384 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.407129 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fh2d2" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.412239 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vxcd9"] Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.487607 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznqm\" (UniqueName: \"kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.487713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.487883 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.488002 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.488053 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.488108 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.572215 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0" path="/var/lib/kubelet/pods/30f9c21f-ff7c-4f5c-a8e8-9e6ba5f824f0/volumes" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.573583 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19bb44e-0486-4793-94cb-2b0cf646c219" path="/var/lib/kubelet/pods/c19bb44e-0486-4793-94cb-2b0cf646c219/volumes" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.574162 4745 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64c8ae5-485f-49b0-b232-fb97516a2ad2" path="/var/lib/kubelet/pods/c64c8ae5-485f-49b0-b232-fb97516a2ad2/volumes" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590336 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590439 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590640 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznqm\" (UniqueName: \"kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590724 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data\") pod \"keystone-bootstrap-vxcd9\" (UID: 
\"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.590765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.596331 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.596661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.597156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.601237 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.601635 
4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.609181 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznqm\" (UniqueName: \"kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm\") pod \"keystone-bootstrap-vxcd9\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:49 crc kubenswrapper[4745]: I1209 11:53:49.773072 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:53:53 crc kubenswrapper[4745]: E1209 11:53:53.506790 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311eddda_625a_4029_ba56_b408b2242eb5.slice/crio-ff9a5a6e802cf0168d9d01d325c9e7da33cbb8d5f307314d40c85aa0596d29c7\": RecentStats: unable to find data in memory cache]" Dec 09 11:53:55 crc kubenswrapper[4745]: I1209 11:53:55.475748 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:53:55 crc kubenswrapper[4745]: I1209 11:53:55.476134 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 09 11:53:55 crc kubenswrapper[4745]: I1209 11:53:55.974573 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Dec 09 11:53:59 crc kubenswrapper[4745]: E1209 11:53:59.341668 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Dec 09 11:53:59 crc kubenswrapper[4745]: E1209 11:53:59.342838 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qr6c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-jmljq_openstack(72a305c7-8afb-4b56-90b1-e071980fbcdd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:53:59 crc kubenswrapper[4745]: E1209 11:53:59.345383 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-jmljq" 
podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.418772 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.599751 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.599808 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.599845 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnz46\" (UniqueName: \"kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.599920 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.599986 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: 
\"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.600057 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc\") pod \"484908d8-e648-4fa4-954b-490f2f06ebb6\" (UID: \"484908d8-e648-4fa4-954b-490f2f06ebb6\") " Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.626985 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46" (OuterVolumeSpecName: "kube-api-access-pnz46") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "kube-api-access-pnz46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.704640 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnz46\" (UniqueName: \"kubernetes.io/projected/484908d8-e648-4fa4-954b-490f2f06ebb6-kube-api-access-pnz46\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.741441 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.748576 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.751794 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config" (OuterVolumeSpecName: "config") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.761101 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.786794 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "484908d8-e648-4fa4-954b-490f2f06ebb6" (UID: "484908d8-e648-4fa4-954b-490f2f06ebb6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.807206 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.807241 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.807251 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.807261 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.807270 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484908d8-e648-4fa4-954b-490f2f06ebb6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.831813 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.836916 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" event={"ID":"484908d8-e648-4fa4-954b-490f2f06ebb6","Type":"ContainerDied","Data":"2bced92cf5e31be9894291401d8805cfb36d58fc8fd755a9a68a522600166e53"} Dec 09 11:53:59 crc kubenswrapper[4745]: E1209 11:53:59.838919 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-jmljq" podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.894841 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:53:59 crc kubenswrapper[4745]: I1209 11:53:59.905590 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-chqjr"] Dec 09 11:54:00 crc kubenswrapper[4745]: E1209 11:54:00.969795 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Dec 09 11:54:00 crc kubenswrapper[4745]: E1209 11:54:00.970643 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tz425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mj74p_openstack(aa1920d0-1725-4be7-baa4-e6561fcce10c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 11:54:00 crc kubenswrapper[4745]: E1209 11:54:00.972029 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mj74p" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" Dec 09 11:54:00 crc kubenswrapper[4745]: I1209 11:54:00.975517 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75bdffd66f-chqjr" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.002448 4745 scope.go:117] "RemoveContainer" containerID="fb367427c862f0d2d064e6570c5b3e1c0719b4b61822c83482725558c0ff602d" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.161896 4745 scope.go:117] "RemoveContainer" containerID="e451589f11f069ecce6b8c32faed56ed4e792c4e545fb97c88f9ce7e41095a15" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.252966 4745 scope.go:117] "RemoveContainer" containerID="88cedf9f96e1985c3236b7e018f24e86e7a957d8e27b94da5f1c739471af5f63" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.287430 4745 scope.go:117] "RemoveContainer" containerID="a3201869945e2890e32a0e743f4f04aa015cf684636a1287e3cdf9f3358fc160" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.330456 4745 scope.go:117] "RemoveContainer" containerID="b28c2d59d8afd69dd3c2c32acf8a0fb3dae8874a168f38806487c53880473616" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 
11:54:01.565973 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" path="/var/lib/kubelet/pods/484908d8-e648-4fa4-954b-490f2f06ebb6/volumes" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.710238 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.726423 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vxcd9"] Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.736677 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.824760 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:54:01 crc kubenswrapper[4745]: W1209 11:54:01.841470 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8d5a02f_3b73_4a98_8b02_4f150574674e.slice/crio-02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8 WatchSource:0}: Error finding container 02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8: Status 404 returned error can't find the container with id 02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8 Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.864772 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxcd9" event={"ID":"9775e8b1-d49b-42eb-9941-5de54e89f465","Type":"ContainerStarted","Data":"7bd1b573332a1112e8c84d3dacfe4fe24d870d36a595aa2ab579f66108a0e5e1"} Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.886931 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerStarted","Data":"02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8"} Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.894853 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n7q5f" event={"ID":"8b162675-ea0a-4433-a94a-1f5bd6c81e01","Type":"ContainerStarted","Data":"483597a5196d57f3f4f001d30d0b02bc3b98e998a978add42e621138a1dbee7e"} Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.902383 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerStarted","Data":"98404fc4ea5196498e38574d8cff7c571312490bfef5bcd9d09ba1fd7edd741a"} Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.926295 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerStarted","Data":"95ef1d24e53155337ca1b234cf7e8a3d22309ee1aef2f7559b0bb0c49a7e7268"} Dec 09 11:54:01 crc kubenswrapper[4745]: E1209 11:54:01.929116 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-mj74p" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" Dec 09 11:54:01 crc kubenswrapper[4745]: I1209 11:54:01.935307 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-n7q5f" podStartSLOduration=3.265450233 podStartE2EDuration="28.935276895s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="2025-12-09 11:53:35.279159067 +0000 UTC m=+1302.104360591" lastFinishedPulling="2025-12-09 11:54:00.948985729 +0000 UTC m=+1327.774187253" 
observedRunningTime="2025-12-09 11:54:01.923360644 +0000 UTC m=+1328.748562198" watchObservedRunningTime="2025-12-09 11:54:01.935276895 +0000 UTC m=+1328.760478419" Dec 09 11:54:02 crc kubenswrapper[4745]: I1209 11:54:02.962344 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerStarted","Data":"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c"} Dec 09 11:54:02 crc kubenswrapper[4745]: I1209 11:54:02.969004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerStarted","Data":"e8813e68ec5c04a5eca84f32495694cece28ddc0c160e62ff127bb4da6d2300f"} Dec 09 11:54:02 crc kubenswrapper[4745]: I1209 11:54:02.972110 4745 generic.go:334] "Generic (PLEG): container finished" podID="dcdefb66-0234-4aa7-97e4-bba6107a3e7d" containerID="3e03148e4090f73a824b767293e49cf1ab1feefbb029d6cb1a8a2eacdd6d58e1" exitCode=0 Dec 09 11:54:02 crc kubenswrapper[4745]: I1209 11:54:02.972172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9zn5l" event={"ID":"dcdefb66-0234-4aa7-97e4-bba6107a3e7d","Type":"ContainerDied","Data":"3e03148e4090f73a824b767293e49cf1ab1feefbb029d6cb1a8a2eacdd6d58e1"} Dec 09 11:54:02 crc kubenswrapper[4745]: I1209 11:54:02.977933 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxcd9" event={"ID":"9775e8b1-d49b-42eb-9941-5de54e89f465","Type":"ContainerStarted","Data":"c1d791100af6e66bae867dd47c4c3aa0a99fe52177c4a48d7d7605c2a1a340ad"} Dec 09 11:54:03 crc kubenswrapper[4745]: I1209 11:54:03.019013 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vxcd9" podStartSLOduration=14.018990326 podStartE2EDuration="14.018990326s" podCreationTimestamp="2025-12-09 11:53:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:03.016036157 +0000 UTC m=+1329.841237681" watchObservedRunningTime="2025-12-09 11:54:03.018990326 +0000 UTC m=+1329.844191840" Dec 09 11:54:03 crc kubenswrapper[4745]: I1209 11:54:03.990883 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerStarted","Data":"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775"} Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.007965 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerStarted","Data":"416bc8400e98805bbceaa9c931f9fca8fd243adefdbab5e6704f409c75c19e64"} Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.011745 4745 generic.go:334] "Generic (PLEG): container finished" podID="8b162675-ea0a-4433-a94a-1f5bd6c81e01" containerID="483597a5196d57f3f4f001d30d0b02bc3b98e998a978add42e621138a1dbee7e" exitCode=0 Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.011844 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n7q5f" event={"ID":"8b162675-ea0a-4433-a94a-1f5bd6c81e01","Type":"ContainerDied","Data":"483597a5196d57f3f4f001d30d0b02bc3b98e998a978add42e621138a1dbee7e"} Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.038461 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerStarted","Data":"ea10af9989f367cd48d7ef0a3dcd9b223668ea66498a7029b126d22456a6913b"} Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.038870 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.038842828 podStartE2EDuration="16.038842828s" 
podCreationTimestamp="2025-12-09 11:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:04.038020636 +0000 UTC m=+1330.863222180" watchObservedRunningTime="2025-12-09 11:54:04.038842828 +0000 UTC m=+1330.864044352" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.099307 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.099268095 podStartE2EDuration="16.099268095s" podCreationTimestamp="2025-12-09 11:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:04.070212543 +0000 UTC m=+1330.895414067" watchObservedRunningTime="2025-12-09 11:54:04.099268095 +0000 UTC m=+1330.924469639" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.515586 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.640974 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st28d\" (UniqueName: \"kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d\") pod \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.641042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config\") pod \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.641177 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle\") pod \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\" (UID: \"dcdefb66-0234-4aa7-97e4-bba6107a3e7d\") " Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.648916 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d" (OuterVolumeSpecName: "kube-api-access-st28d") pod "dcdefb66-0234-4aa7-97e4-bba6107a3e7d" (UID: "dcdefb66-0234-4aa7-97e4-bba6107a3e7d"). InnerVolumeSpecName "kube-api-access-st28d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.678972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcdefb66-0234-4aa7-97e4-bba6107a3e7d" (UID: "dcdefb66-0234-4aa7-97e4-bba6107a3e7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.681641 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config" (OuterVolumeSpecName: "config") pod "dcdefb66-0234-4aa7-97e4-bba6107a3e7d" (UID: "dcdefb66-0234-4aa7-97e4-bba6107a3e7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.744225 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.744280 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:04 crc kubenswrapper[4745]: I1209 11:54:04.744295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st28d\" (UniqueName: \"kubernetes.io/projected/dcdefb66-0234-4aa7-97e4-bba6107a3e7d-kube-api-access-st28d\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.053559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9zn5l" event={"ID":"dcdefb66-0234-4aa7-97e4-bba6107a3e7d","Type":"ContainerDied","Data":"542636681e78021afa3052f516acccb13cafa50aa8b6f2986540aff2cf061da6"} Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.053669 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="542636681e78021afa3052f516acccb13cafa50aa8b6f2986540aff2cf061da6" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.053585 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9zn5l" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.256424 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:05 crc kubenswrapper[4745]: E1209 11:54:05.257026 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="init" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.260761 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="init" Dec 09 11:54:05 crc kubenswrapper[4745]: E1209 11:54:05.260850 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.260919 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" Dec 09 11:54:05 crc kubenswrapper[4745]: E1209 11:54:05.261042 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdefb66-0234-4aa7-97e4-bba6107a3e7d" containerName="neutron-db-sync" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.261109 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdefb66-0234-4aa7-97e4-bba6107a3e7d" containerName="neutron-db-sync" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.261552 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="484908d8-e648-4fa4-954b-490f2f06ebb6" containerName="dnsmasq-dns" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.261636 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdefb66-0234-4aa7-97e4-bba6107a3e7d" containerName="neutron-db-sync" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.262821 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.294060 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.346615 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"] Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.349221 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356131 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356231 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356267 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356788 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356838 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.356865 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2d8\" (UniqueName: \"kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.357547 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.357788 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hqlhg" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.358048 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.358170 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.402755 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"] Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463046 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463127 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463189 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463223 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463282 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463324 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463349 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2d8\" (UniqueName: \"kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463377 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdjm\" (UniqueName: \"kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463435 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.463460 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.464592 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.464678 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.465038 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.465153 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.465252 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.508957 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2d8\" (UniqueName: \"kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8\") pod \"dnsmasq-dns-7bb67c87c9-9jxz8\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.567826 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.568434 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.568476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.569094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdjm\" (UniqueName: \"kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " 
pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.569363 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.574740 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.576355 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.582353 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.586490 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.595787 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xwdjm\" (UniqueName: \"kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm\") pod \"neutron-b5f4bd8c8-nsssp\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") " pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.624185 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.699531 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.802013 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n7q5f" Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.990427 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle\") pod \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.990884 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data\") pod \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.991065 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf5tp\" (UniqueName: \"kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp\") pod \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.991115 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs\") pod \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " Dec 09 11:54:05 crc kubenswrapper[4745]: I1209 11:54:05.991247 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts\") pod \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\" (UID: \"8b162675-ea0a-4433-a94a-1f5bd6c81e01\") " Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.000594 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp" (OuterVolumeSpecName: "kube-api-access-kf5tp") pod "8b162675-ea0a-4433-a94a-1f5bd6c81e01" (UID: "8b162675-ea0a-4433-a94a-1f5bd6c81e01"). InnerVolumeSpecName "kube-api-access-kf5tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.000964 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs" (OuterVolumeSpecName: "logs") pod "8b162675-ea0a-4433-a94a-1f5bd6c81e01" (UID: "8b162675-ea0a-4433-a94a-1f5bd6c81e01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.009313 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts" (OuterVolumeSpecName: "scripts") pod "8b162675-ea0a-4433-a94a-1f5bd6c81e01" (UID: "8b162675-ea0a-4433-a94a-1f5bd6c81e01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.024953 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b162675-ea0a-4433-a94a-1f5bd6c81e01" (UID: "8b162675-ea0a-4433-a94a-1f5bd6c81e01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.045615 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data" (OuterVolumeSpecName: "config-data") pod "8b162675-ea0a-4433-a94a-1f5bd6c81e01" (UID: "8b162675-ea0a-4433-a94a-1f5bd6c81e01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.093836 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n7q5f" event={"ID":"8b162675-ea0a-4433-a94a-1f5bd6c81e01","Type":"ContainerDied","Data":"81a89f289f0d5f11aef8ce5acc3a8d75da0fb7dac953050c2db1686a4255c8b2"} Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.093901 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81a89f289f0d5f11aef8ce5acc3a8d75da0fb7dac953050c2db1686a4255c8b2" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.093993 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n7q5f" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.094891 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf5tp\" (UniqueName: \"kubernetes.io/projected/8b162675-ea0a-4433-a94a-1f5bd6c81e01-kube-api-access-kf5tp\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.094916 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b162675-ea0a-4433-a94a-1f5bd6c81e01-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.094926 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.094936 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.094944 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b162675-ea0a-4433-a94a-1f5bd6c81e01-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.096807 4745 generic.go:334] "Generic (PLEG): container finished" podID="9775e8b1-d49b-42eb-9941-5de54e89f465" containerID="c1d791100af6e66bae867dd47c4c3aa0a99fe52177c4a48d7d7605c2a1a340ad" exitCode=0 Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.096865 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxcd9" event={"ID":"9775e8b1-d49b-42eb-9941-5de54e89f465","Type":"ContainerDied","Data":"c1d791100af6e66bae867dd47c4c3aa0a99fe52177c4a48d7d7605c2a1a340ad"} Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 
11:54:06.285678 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:54:06 crc kubenswrapper[4745]: E1209 11:54:06.286190 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b162675-ea0a-4433-a94a-1f5bd6c81e01" containerName="placement-db-sync" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.286217 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b162675-ea0a-4433-a94a-1f5bd6c81e01" containerName="placement-db-sync" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.286519 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b162675-ea0a-4433-a94a-1f5bd6c81e01" containerName="placement-db-sync" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.287586 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.296349 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zvjnk" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.296714 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.296833 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.296931 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.297117 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.307093 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404057 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4btj2\" (UniqueName: \"kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404134 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404164 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404190 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404295 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 
11:54:06.404328 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.404343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.431611 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.489161 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"] Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507113 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4btj2\" (UniqueName: \"kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507482 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507578 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507806 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.507888 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.513939 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs\") pod 
\"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.516596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.516966 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.518720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.519189 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.519668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 
09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.534386 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4btj2\" (UniqueName: \"kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2\") pod \"placement-7db6b497c6-z9vtr\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:06 crc kubenswrapper[4745]: I1209 11:54:06.648523 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.094859 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.102068 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.104707 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.104966 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.114666 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.177789 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.177862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.177911 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr99t\" (UniqueName: \"kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.178074 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.178195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.178242 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.178571 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.218455 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.218520 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.231709 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.231761 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280279 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280795 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280850 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280910 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jr99t\" (UniqueName: \"kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280968 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.280996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.281018 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.281108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.286987 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:54:09 crc 
kubenswrapper[4745]: I1209 11:54:09.289108 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.290817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.291560 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.292108 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.292199 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.295253 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.299654 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.302370 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.304757 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr99t\" (UniqueName: \"kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t\") pod \"neutron-5b68fbfd5-bx5sx\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:09 crc kubenswrapper[4745]: I1209 11:54:09.430556 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:10 crc kubenswrapper[4745]: I1209 11:54:10.164174 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:10 crc kubenswrapper[4745]: I1209 11:54:10.164565 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:54:10 crc kubenswrapper[4745]: I1209 11:54:10.164583 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:10 crc kubenswrapper[4745]: I1209 11:54:10.164597 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.185014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerStarted","Data":"ca7f8020f033eb2997746c35bc767adbe0fa8151e1988ff93196c22ca7d1db8d"} Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.192949 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxcd9" event={"ID":"9775e8b1-d49b-42eb-9941-5de54e89f465","Type":"ContainerDied","Data":"7bd1b573332a1112e8c84d3dacfe4fe24d870d36a595aa2ab579f66108a0e5e1"} Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.192999 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd1b573332a1112e8c84d3dacfe4fe24d870d36a595aa2ab579f66108a0e5e1" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.195704 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" event={"ID":"d02ef196-0d33-4546-8670-bd7dbb54e9b1","Type":"ContainerStarted","Data":"46eee2b3fbab85957218daf1c2aabe4bfefe38aaae84b0c5c39ac483db6b955f"} Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.303543 4745 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340466 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340581 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340651 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340686 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznqm\" (UniqueName: \"kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.340824 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle\") pod \"9775e8b1-d49b-42eb-9941-5de54e89f465\" (UID: \"9775e8b1-d49b-42eb-9941-5de54e89f465\") " Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.350181 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts" (OuterVolumeSpecName: "scripts") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.370406 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm" (OuterVolumeSpecName: "kube-api-access-gznqm") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "kube-api-access-gznqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.371728 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.407768 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.443932 4745 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.443991 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznqm\" (UniqueName: \"kubernetes.io/projected/9775e8b1-d49b-42eb-9941-5de54e89f465-kube-api-access-gznqm\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.444011 4745 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.444023 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.485014 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.710641 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data" (OuterVolumeSpecName: "config-data") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.718024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9775e8b1-d49b-42eb-9941-5de54e89f465" (UID: "9775e8b1-d49b-42eb-9941-5de54e89f465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.738669 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.787841 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:11 crc kubenswrapper[4745]: I1209 11:54:11.787874 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9775e8b1-d49b-42eb-9941-5de54e89f465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.225811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerStarted","Data":"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.231630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db6b497c6-z9vtr" event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerStarted","Data":"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.231660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db6b497c6-z9vtr" 
event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerStarted","Data":"92f474fd86deaf26a3ebed0f6f198818937e580e74a9f872d9b4abf33086e260"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.256159 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerStarted","Data":"8348c879ee07ca3f7d84fb1ffb00d578ef657f44ba95d488301a8bbc08a8dd73"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.261149 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerStarted","Data":"90bead3ede055afdc07bafce2c445e1e313577dd1e987bb5e0e9306ae04c1227"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.266014 4745 generic.go:334] "Generic (PLEG): container finished" podID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerID="9479b00344fe8ffbe4e7f32232a1f6cd2146c5dbcb30a375a14d294380268267" exitCode=0 Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.266106 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" event={"ID":"d02ef196-0d33-4546-8670-bd7dbb54e9b1","Type":"ContainerDied","Data":"9479b00344fe8ffbe4e7f32232a1f6cd2146c5dbcb30a375a14d294380268267"} Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.266149 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vxcd9" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.515604 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:54:12 crc kubenswrapper[4745]: E1209 11:54:12.516143 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9775e8b1-d49b-42eb-9941-5de54e89f465" containerName="keystone-bootstrap" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.516165 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9775e8b1-d49b-42eb-9941-5de54e89f465" containerName="keystone-bootstrap" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.516348 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9775e8b1-d49b-42eb-9941-5de54e89f465" containerName="keystone-bootstrap" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.517148 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.520885 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.521565 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fh2d2" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.521797 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.521816 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.522535 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.522549 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-config-data" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.534491 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.607551 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.607640 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.607746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.607824 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.607920 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.608098 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.608177 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.608252 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkswp\" (UniqueName: \"kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710485 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710663 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710697 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710785 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts\") pod 
\"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.710808 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkswp\" (UniqueName: \"kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.718588 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.718996 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.719177 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.719276 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " 
pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.719296 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.722820 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.725561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.730100 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkswp\" (UniqueName: \"kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp\") pod \"keystone-56c9f47db8-wmdgz\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") " pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.758123 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.760627 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.840088 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.976162 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:54:12 crc kubenswrapper[4745]: I1209 11:54:12.976241 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.016668 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.021119 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.389083 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" event={"ID":"d02ef196-0d33-4546-8670-bd7dbb54e9b1","Type":"ContainerStarted","Data":"c99f3570c494823f8b0bfc38c69a7643421d26c483cf943e0cedfc105b0577d1"} Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.390881 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.419230 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerStarted","Data":"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"} Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.419411 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.428982 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db6b497c6-z9vtr" 
event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerStarted","Data":"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde"} Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.430324 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.430368 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.453649 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerStarted","Data":"366b1b96d54298d8ed7a757e883bb67d2c254c7379c342add2a12b68039fea8b"} Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.459985 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" podStartSLOduration=8.459960176 podStartE2EDuration="8.459960176s" podCreationTimestamp="2025-12-09 11:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:13.445944069 +0000 UTC m=+1340.271145593" watchObservedRunningTime="2025-12-09 11:54:13.459960176 +0000 UTC m=+1340.285161700" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.473357 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7db6b497c6-z9vtr" podStartSLOduration=7.473337827 podStartE2EDuration="7.473337827s" podCreationTimestamp="2025-12-09 11:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:13.469595016 +0000 UTC m=+1340.294796550" watchObservedRunningTime="2025-12-09 11:54:13.473337827 +0000 UTC m=+1340.298539351" Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 
11:54:13.527624 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:54:13 crc kubenswrapper[4745]: I1209 11:54:13.530274 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b5f4bd8c8-nsssp" podStartSLOduration=8.530244659 podStartE2EDuration="8.530244659s" podCreationTimestamp="2025-12-09 11:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:13.503132159 +0000 UTC m=+1340.328333693" watchObservedRunningTime="2025-12-09 11:54:13.530244659 +0000 UTC m=+1340.355446183" Dec 09 11:54:14 crc kubenswrapper[4745]: I1209 11:54:14.464876 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c9f47db8-wmdgz" event={"ID":"ff59337d-f366-446b-9752-eb371ee468e4","Type":"ContainerStarted","Data":"6ae9c39299fade370baa6f26261ff6c95df9c9b15cbf11b461365f6b474c6e83"} Dec 09 11:54:15 crc kubenswrapper[4745]: I1209 11:54:15.488314 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jmljq" event={"ID":"72a305c7-8afb-4b56-90b1-e071980fbcdd","Type":"ContainerStarted","Data":"14f30d7116a31c94310abb53f8936d59624cb617dd766351680c7a5bb720b515"} Dec 09 11:54:15 crc kubenswrapper[4745]: I1209 11:54:15.492329 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerStarted","Data":"3c97425c86c2cd129c5f56303b9b231be4792489fd433e4c00bc829ef51d297c"} Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.504770 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c9f47db8-wmdgz" event={"ID":"ff59337d-f366-446b-9752-eb371ee468e4","Type":"ContainerStarted","Data":"e26931e675e3e053e3c67af648d2433105486188c6e62356a6e8b58996656df4"} Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.505652 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.508826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mj74p" event={"ID":"aa1920d0-1725-4be7-baa4-e6561fcce10c","Type":"ContainerStarted","Data":"47f1559d229361b96fe0ec92e8bff823067257e88ef088fa96fb0e9ea34b38bc"} Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.509116 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.532199 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56c9f47db8-wmdgz" podStartSLOduration=4.532175591 podStartE2EDuration="4.532175591s" podCreationTimestamp="2025-12-09 11:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:16.528608665 +0000 UTC m=+1343.353810209" watchObservedRunningTime="2025-12-09 11:54:16.532175591 +0000 UTC m=+1343.357377115" Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.534746 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jmljq" podStartSLOduration=5.752485242 podStartE2EDuration="43.534730969s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="2025-12-09 11:53:35.260549586 +0000 UTC m=+1302.085751110" lastFinishedPulling="2025-12-09 11:54:13.042795323 +0000 UTC m=+1339.867996837" observedRunningTime="2025-12-09 11:54:15.517544271 +0000 UTC m=+1342.342745795" watchObservedRunningTime="2025-12-09 11:54:16.534730969 +0000 UTC m=+1343.359932493" Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.567078 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b68fbfd5-bx5sx" podStartSLOduration=7.56705459 
podStartE2EDuration="7.56705459s" podCreationTimestamp="2025-12-09 11:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:16.562901958 +0000 UTC m=+1343.388103482" watchObservedRunningTime="2025-12-09 11:54:16.56705459 +0000 UTC m=+1343.392256114" Dec 09 11:54:16 crc kubenswrapper[4745]: I1209 11:54:16.584913 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mj74p" podStartSLOduration=3.111367896 podStartE2EDuration="43.58488558s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="2025-12-09 11:53:35.325833514 +0000 UTC m=+1302.151035038" lastFinishedPulling="2025-12-09 11:54:15.799351198 +0000 UTC m=+1342.624552722" observedRunningTime="2025-12-09 11:54:16.581144619 +0000 UTC m=+1343.406346143" watchObservedRunningTime="2025-12-09 11:54:16.58488558 +0000 UTC m=+1343.410087104" Dec 09 11:54:18 crc kubenswrapper[4745]: I1209 11:54:18.532601 4745 generic.go:334] "Generic (PLEG): container finished" podID="72a305c7-8afb-4b56-90b1-e071980fbcdd" containerID="14f30d7116a31c94310abb53f8936d59624cb617dd766351680c7a5bb720b515" exitCode=0 Dec 09 11:54:18 crc kubenswrapper[4745]: I1209 11:54:18.532714 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jmljq" event={"ID":"72a305c7-8afb-4b56-90b1-e071980fbcdd","Type":"ContainerDied","Data":"14f30d7116a31c94310abb53f8936d59624cb617dd766351680c7a5bb720b515"} Dec 09 11:54:20 crc kubenswrapper[4745]: I1209 11:54:20.626992 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:20 crc kubenswrapper[4745]: I1209 11:54:20.685051 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:54:20 crc kubenswrapper[4745]: I1209 11:54:20.685374 4745 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerName="dnsmasq-dns" containerID="cri-o://8869a49aa69d605b4e05eb848339e645af9ddb24aed8f91b34855b9f996f246c" gracePeriod=10 Dec 09 11:54:21 crc kubenswrapper[4745]: I1209 11:54:21.582920 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerID="8869a49aa69d605b4e05eb848339e645af9ddb24aed8f91b34855b9f996f246c" exitCode=0 Dec 09 11:54:21 crc kubenswrapper[4745]: I1209 11:54:21.582998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" event={"ID":"cb3343f8-ef08-4d66-a004-cf5b1044dded","Type":"ContainerDied","Data":"8869a49aa69d605b4e05eb848339e645af9ddb24aed8f91b34855b9f996f246c"} Dec 09 11:54:21 crc kubenswrapper[4745]: I1209 11:54:21.590710 4745 generic.go:334] "Generic (PLEG): container finished" podID="aa1920d0-1725-4be7-baa4-e6561fcce10c" containerID="47f1559d229361b96fe0ec92e8bff823067257e88ef088fa96fb0e9ea34b38bc" exitCode=0 Dec 09 11:54:21 crc kubenswrapper[4745]: I1209 11:54:21.590762 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mj74p" event={"ID":"aa1920d0-1725-4be7-baa4-e6561fcce10c","Type":"ContainerDied","Data":"47f1559d229361b96fe0ec92e8bff823067257e88ef088fa96fb0e9ea34b38bc"} Dec 09 11:54:21 crc kubenswrapper[4745]: I1209 11:54:21.924895 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jmljq" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.033674 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr6c7\" (UniqueName: \"kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7\") pod \"72a305c7-8afb-4b56-90b1-e071980fbcdd\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.033793 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle\") pod \"72a305c7-8afb-4b56-90b1-e071980fbcdd\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.033930 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data\") pod \"72a305c7-8afb-4b56-90b1-e071980fbcdd\" (UID: \"72a305c7-8afb-4b56-90b1-e071980fbcdd\") " Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.042922 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72a305c7-8afb-4b56-90b1-e071980fbcdd" (UID: "72a305c7-8afb-4b56-90b1-e071980fbcdd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.062490 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7" (OuterVolumeSpecName: "kube-api-access-qr6c7") pod "72a305c7-8afb-4b56-90b1-e071980fbcdd" (UID: "72a305c7-8afb-4b56-90b1-e071980fbcdd"). 
InnerVolumeSpecName "kube-api-access-qr6c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.099567 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a305c7-8afb-4b56-90b1-e071980fbcdd" (UID: "72a305c7-8afb-4b56-90b1-e071980fbcdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.135943 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr6c7\" (UniqueName: \"kubernetes.io/projected/72a305c7-8afb-4b56-90b1-e071980fbcdd-kube-api-access-qr6c7\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.135981 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.135995 4745 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72a305c7-8afb-4b56-90b1-e071980fbcdd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.603804 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jmljq" Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.603920 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jmljq" event={"ID":"72a305c7-8afb-4b56-90b1-e071980fbcdd","Type":"ContainerDied","Data":"a26e3c93faab4711cf03b9f8270bd7cbb96a37036761f7a143df979d059f23d8"} Dec 09 11:54:22 crc kubenswrapper[4745]: I1209 11:54:22.604362 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26e3c93faab4711cf03b9f8270bd7cbb96a37036761f7a143df979d059f23d8" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.207723 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:54:23 crc kubenswrapper[4745]: E1209 11:54:23.208273 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" containerName="barbican-db-sync" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.208291 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" containerName="barbican-db-sync" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.208630 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" containerName="barbican-db-sync" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.209888 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.214635 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.214867 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4n696" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.214890 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.232038 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.234006 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.239224 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.269690 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.352044 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373456 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373556 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373669 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggwb\" (UniqueName: \"kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373851 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373921 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.373987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " 
pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.374026 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.374207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.374337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.375257 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scr5f\" (UniqueName: \"kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.387584 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 
11:54:23.389786 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.397976 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.483795 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scr5f\" (UniqueName: \"kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.483898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.483929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484013 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggwb\" (UniqueName: \"kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484061 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484088 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484148 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484175 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.484217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 
11:54:23.484259 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.485808 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.486235 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.503350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.504988 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 
11:54:23.507669 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.509419 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.512303 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.512643 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.514296 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scr5f\" (UniqueName: \"kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f\") pod \"barbican-worker-7cf679bcc-g65zr\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.514709 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.515500 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.518855 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.522216 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggwb\" (UniqueName: \"kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb\") pod \"barbican-keystone-listener-55565d45c6-5hsz5\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.531678 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.535596 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.559592 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585545 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585609 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585638 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585695 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5clt\" (UniqueName: \"kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585727 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.585764 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.613533 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.617657 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" event={"ID":"cb3343f8-ef08-4d66-a004-cf5b1044dded","Type":"ContainerDied","Data":"f14e3180bacdba17dc91c7878f52c51c5059c2ac3f72840bd7bed8c90da09e13"} Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.617716 4745 scope.go:117] "RemoveContainer" containerID="8869a49aa69d605b4e05eb848339e645af9ddb24aed8f91b34855b9f996f246c" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.617683 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-bl7rg" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.625474 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mj74p" event={"ID":"aa1920d0-1725-4be7-baa4-e6561fcce10c","Type":"ContainerDied","Data":"26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb"} Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.625560 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e060578597ab66f50b2d1ccb28c1c793a1af477d8d18bdc5ec92b81bdc3adb" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.626097 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mj74p" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687718 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5clt\" (UniqueName: \"kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687799 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 
11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687894 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kpc\" (UniqueName: \"kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.687918 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.688007 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.688059 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 
11:54:23.688086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.688122 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.688150 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.689972 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.692606 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.694469 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.695055 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.697353 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.719599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5clt\" (UniqueName: \"kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt\") pod \"dnsmasq-dns-54c4dfcffc-zcgsf\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.789538 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.789827 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790019 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790177 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790252 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790360 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790573 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790674 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790753 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kctn7\" (UniqueName: \"kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790851 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0\") pod \"cb3343f8-ef08-4d66-a004-cf5b1044dded\" (UID: \"cb3343f8-ef08-4d66-a004-cf5b1044dded\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.790938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791024 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz425\" (UniqueName: \"kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425\") pod \"aa1920d0-1725-4be7-baa4-e6561fcce10c\" (UID: \"aa1920d0-1725-4be7-baa4-e6561fcce10c\") " Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791342 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kpc\" (UniqueName: 
\"kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791629 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791714 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791852 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.791931 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.796350 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425" 
(OuterVolumeSpecName: "kube-api-access-tz425") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "kube-api-access-tz425". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.797040 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.798406 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts" (OuterVolumeSpecName: "scripts") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.799167 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.803219 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.807322 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.809764 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.813639 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7" (OuterVolumeSpecName: "kube-api-access-kctn7") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). 
InnerVolumeSpecName "kube-api-access-kctn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.821247 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.822591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kpc\" (UniqueName: \"kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc\") pod \"barbican-api-6765469b68-lmphn\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") " pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.824327 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.853879 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.853926 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config" (OuterVolumeSpecName: "config") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.857034 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data" (OuterVolumeSpecName: "config-data") pod "aa1920d0-1725-4be7-baa4-e6561fcce10c" (UID: "aa1920d0-1725-4be7-baa4-e6561fcce10c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.865824 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.869467 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.876337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cb3343f8-ef08-4d66-a004-cf5b1044dded" (UID: "cb3343f8-ef08-4d66-a004-cf5b1044dded"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893787 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893829 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893840 4745 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893854 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893864 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kctn7\" (UniqueName: \"kubernetes.io/projected/cb3343f8-ef08-4d66-a004-cf5b1044dded-kube-api-access-kctn7\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893874 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893883 4745 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa1920d0-1725-4be7-baa4-e6561fcce10c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893893 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz425\" (UniqueName: \"kubernetes.io/projected/aa1920d0-1725-4be7-baa4-e6561fcce10c-kube-api-access-tz425\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893903 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893911 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893919 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa1920d0-1725-4be7-baa4-e6561fcce10c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.893928 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb3343f8-ef08-4d66-a004-cf5b1044dded-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.951868 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.969704 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:54:23 crc kubenswrapper[4745]: I1209 11:54:23.979963 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-bl7rg"] Dec 09 11:54:24 crc kubenswrapper[4745]: I1209 11:54:24.015787 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:24 crc kubenswrapper[4745]: I1209 11:54:24.180890 4745 scope.go:117] "RemoveContainer" containerID="9b695dd7f2628ab4074c4b49199ee571b429219d71ac7aa99b2e90bee6a995ea" Dec 09 11:54:24 crc kubenswrapper[4745]: I1209 11:54:24.638196 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mj74p" Dec 09 11:54:24 crc kubenswrapper[4745]: I1209 11:54:24.890702 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019140 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:54:25 crc kubenswrapper[4745]: E1209 11:54:25.019642 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerName="init" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019662 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerName="init" Dec 09 11:54:25 crc kubenswrapper[4745]: E1209 11:54:25.019690 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerName="dnsmasq-dns" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019698 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" 
containerName="dnsmasq-dns" Dec 09 11:54:25 crc kubenswrapper[4745]: E1209 11:54:25.019734 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" containerName="cinder-db-sync" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019742 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" containerName="cinder-db-sync" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019935 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" containerName="cinder-db-sync" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.019952 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" containerName="dnsmasq-dns" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.021539 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.027268 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.033730 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.035525 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ql62n" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.035782 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.035835 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.043094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"] Dec 09 
11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.096265 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.119490 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133369 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133453 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcjb\" (UniqueName: \"kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133632 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.133699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.152647 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.182828 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.193045 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.201792 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235464 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235542 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcjb\" (UniqueName: \"kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235562 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235602 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.235701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.236869 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.244123 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.245835 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.250029 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.256780 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.264876 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcjb\" (UniqueName: \"kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb\") pod \"cinder-scheduler-0\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.276682 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.278400 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.281213 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.288496 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.306844 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.338919 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.339000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.339032 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.339063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.339095 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtxq\" (UniqueName: \"kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq\") pod 
\"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.339176 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441054 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441112 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441142 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441208 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 
09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441288 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9jn\" (UniqueName: \"kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441353 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441377 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441397 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441427 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.441476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtxq\" (UniqueName: \"kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.442733 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.443089 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.443891 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.443991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.444131 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.444175 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.471804 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtxq\" (UniqueName: \"kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq\") pod \"dnsmasq-dns-6b4f5fc4f-2c4hw\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.475116 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.475164 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.475215 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.476606 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.476668 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8" gracePeriod=600 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9jn\" (UniqueName: \"kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545267 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545301 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545319 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545395 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545427 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " 
pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.545568 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.547555 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.549671 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.551890 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.552365 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.554373 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts\") pod 
\"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.566769 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9jn\" (UniqueName: \"kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn\") pod \"cinder-api-0\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.572333 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3343f8-ef08-4d66-a004-cf5b1044dded" path="/var/lib/kubelet/pods/cb3343f8-ef08-4d66-a004-cf5b1044dded/volumes" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.638597 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.659976 4745 generic.go:334] "Generic (PLEG): container finished" podID="2ee9d423-632a-4e0e-9a83-8746a1c3e13b" containerID="07d8f259aab1aac7ed324dba7bac6dd94fe2b2054a20a3ec1f2ff3ae2c2ac31e" exitCode=0 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.660053 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" event={"ID":"2ee9d423-632a-4e0e-9a83-8746a1c3e13b","Type":"ContainerDied","Data":"07d8f259aab1aac7ed324dba7bac6dd94fe2b2054a20a3ec1f2ff3ae2c2ac31e"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.660085 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" event={"ID":"2ee9d423-632a-4e0e-9a83-8746a1c3e13b","Type":"ContainerStarted","Data":"2193b76aaeab7d8e277cfcdd236a657a121888a80eb0fa59531ab10583211e82"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.682795 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" 
event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerStarted","Data":"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.683381 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerStarted","Data":"851c659be8b6d5ce723578bba8d035d480792b31ca537a70d2092073886089da"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.686445 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.693692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerStarted","Data":"16d59b177c3820f47e3ee14727a18e7325b983db2fb60b44ac0a1caa054b9d35"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.708402 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerStarted","Data":"fd7788b125194e4e4a263fa965e205af8f8904f04815be5bca118a21310532f7"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.708699 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-central-agent" containerID="cri-o://98404fc4ea5196498e38574d8cff7c571312490bfef5bcd9d09ba1fd7edd741a" gracePeriod=30 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.709080 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.709083 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="proxy-httpd" containerID="cri-o://fd7788b125194e4e4a263fa965e205af8f8904f04815be5bca118a21310532f7" gracePeriod=30 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.709104 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="sg-core" containerID="cri-o://8348c879ee07ca3f7d84fb1ffb00d578ef657f44ba95d488301a8bbc08a8dd73" gracePeriod=30 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.709191 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-notification-agent" containerID="cri-o://ea10af9989f367cd48d7ef0a3dcd9b223668ea66498a7029b126d22456a6913b" gracePeriod=30 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.740305 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerStarted","Data":"c018599f5942c8fc41469f65fb994bad0f7170c4d04647d6ff4ec90a9a80d783"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.756264 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8" exitCode=0 Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.756311 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8"} Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.756347 4745 scope.go:117] "RemoveContainer" containerID="ae7429356c582a02b4d2b13febbec9379c05cf9bbc3bd01266102d2aeec55a48" Dec 09 11:54:25 crc 
kubenswrapper[4745]: I1209 11:54:25.762993 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.298523646 podStartE2EDuration="52.762957655s" podCreationTimestamp="2025-12-09 11:53:33 +0000 UTC" firstStartedPulling="2025-12-09 11:53:34.906200295 +0000 UTC m=+1301.731401819" lastFinishedPulling="2025-12-09 11:54:24.370634304 +0000 UTC m=+1351.195835828" observedRunningTime="2025-12-09 11:54:25.755004941 +0000 UTC m=+1352.580206465" watchObservedRunningTime="2025-12-09 11:54:25.762957655 +0000 UTC m=+1352.588159169" Dec 09 11:54:25 crc kubenswrapper[4745]: I1209 11:54:25.832331 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.107276 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.264272 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.264577 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.264658 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc 
kubenswrapper[4745]: I1209 11:54:26.264728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.264758 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.264799 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5clt\" (UniqueName: \"kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt\") pod \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\" (UID: \"2ee9d423-632a-4e0e-9a83-8746a1c3e13b\") " Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.271669 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt" (OuterVolumeSpecName: "kube-api-access-z5clt") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "kube-api-access-z5clt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.301219 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.311801 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.316430 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.333385 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config" (OuterVolumeSpecName: "config") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.352375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ee9d423-632a-4e0e-9a83-8746a1c3e13b" (UID: "2ee9d423-632a-4e0e-9a83-8746a1c3e13b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368099 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368142 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368155 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368163 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368173 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.368186 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5clt\" (UniqueName: \"kubernetes.io/projected/2ee9d423-632a-4e0e-9a83-8746a1c3e13b-kube-api-access-z5clt\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.392332 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"] Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.462560 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:54:26 crc 
kubenswrapper[4745]: W1209 11:54:26.466059 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76672101_82d9_4d42_b793_6fa33ce5c91a.slice/crio-9aa60e749f6081103894a8da50a852dcf0e5b867eb9d3d6c1623cf71390549f9 WatchSource:0}: Error finding container 9aa60e749f6081103894a8da50a852dcf0e5b867eb9d3d6c1623cf71390549f9: Status 404 returned error can't find the container with id 9aa60e749f6081103894a8da50a852dcf0e5b867eb9d3d6c1623cf71390549f9 Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.822678 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" event={"ID":"ad70250d-1392-4c1f-b661-09709cb9d7b0","Type":"ContainerStarted","Data":"804aa3fc361d35883323f0a41d9e0ac6467b291cd2b36c6cc5de42e1b4c78913"} Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.856293 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerStarted","Data":"50166daa39c355162f82402f1eacc532f844a4f332c0bbe9dcc9a2fb47099d3c"} Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.871284 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerStarted","Data":"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64"} Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.895855 4745 generic.go:334] "Generic (PLEG): container finished" podID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerID="fd7788b125194e4e4a263fa965e205af8f8904f04815be5bca118a21310532f7" exitCode=0 Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.895889 4745 generic.go:334] "Generic (PLEG): container finished" podID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerID="8348c879ee07ca3f7d84fb1ffb00d578ef657f44ba95d488301a8bbc08a8dd73" exitCode=2 Dec 09 11:54:26 crc 
kubenswrapper[4745]: I1209 11:54:26.895900 4745 generic.go:334] "Generic (PLEG): container finished" podID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerID="98404fc4ea5196498e38574d8cff7c571312490bfef5bcd9d09ba1fd7edd741a" exitCode=0
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.895956 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerDied","Data":"fd7788b125194e4e4a263fa965e205af8f8904f04815be5bca118a21310532f7"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.895991 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerDied","Data":"8348c879ee07ca3f7d84fb1ffb00d578ef657f44ba95d488301a8bbc08a8dd73"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.896005 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerDied","Data":"98404fc4ea5196498e38574d8cff7c571312490bfef5bcd9d09ba1fd7edd741a"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.904874 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.914433 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerStarted","Data":"9aa60e749f6081103894a8da50a852dcf0e5b867eb9d3d6c1623cf71390549f9"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.920308 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6765469b68-lmphn" podStartSLOduration=3.920287547 podStartE2EDuration="3.920287547s" podCreationTimestamp="2025-12-09 11:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:26.91297132 +0000 UTC m=+1353.738172854" watchObservedRunningTime="2025-12-09 11:54:26.920287547 +0000 UTC m=+1353.745489071"
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.932022 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf" event={"ID":"2ee9d423-632a-4e0e-9a83-8746a1c3e13b","Type":"ContainerDied","Data":"2193b76aaeab7d8e277cfcdd236a657a121888a80eb0fa59531ab10583211e82"}
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.932256 4745 scope.go:117] "RemoveContainer" containerID="07d8f259aab1aac7ed324dba7bac6dd94fe2b2054a20a3ec1f2ff3ae2c2ac31e"
Dec 09 11:54:26 crc kubenswrapper[4745]: I1209 11:54:26.932445 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-zcgsf"
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.038585 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"]
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.048619 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-zcgsf"]
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.567230 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee9d423-632a-4e0e-9a83-8746a1c3e13b" path="/var/lib/kubelet/pods/2ee9d423-632a-4e0e-9a83-8746a1c3e13b/volumes"
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.946153 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerID="2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2" exitCode=0
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.946573 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" event={"ID":"ad70250d-1392-4c1f-b661-09709cb9d7b0","Type":"ContainerDied","Data":"2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2"}
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.956035 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerStarted","Data":"15f9d17c8d3ff78dd938d1ea15bc17058234ca145574852e5801d6552db2b1ca"}
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.956328 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6765469b68-lmphn"
Dec 09 11:54:27 crc kubenswrapper[4745]: I1209 11:54:27.956359 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6765469b68-lmphn"
Dec 09 11:54:28 crc kubenswrapper[4745]: I1209 11:54:28.728106 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.857761 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"]
Dec 09 11:54:29 crc kubenswrapper[4745]: E1209 11:54:29.862199 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee9d423-632a-4e0e-9a83-8746a1c3e13b" containerName="init"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.862351 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee9d423-632a-4e0e-9a83-8746a1c3e13b" containerName="init"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.862747 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee9d423-632a-4e0e-9a83-8746a1c3e13b" containerName="init"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.864093 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.867593 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.867971 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.876719 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"]
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.966865 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.966914 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.966938 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.966989 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.967010 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.967075 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:29 crc kubenswrapper[4745]: I1209 11:54:29.967103 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76qp\" (UniqueName: \"kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069373 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069447 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069576 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76qp\" (UniqueName: \"kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069698 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069719 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069745 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.069876 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.078156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.086728 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.089608 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.089690 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.090691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.091797 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76qp\" (UniqueName: \"kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp\") pod \"barbican-api-57876487f8-zgj8m\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.187280 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.654738 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"]
Dec 09 11:54:30 crc kubenswrapper[4745]: I1209 11:54:30.994343 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerStarted","Data":"8c3a126676cf77a7477f9d5072236397973a255f712333be799d5fe817503fba"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.000903 4745 generic.go:334] "Generic (PLEG): container finished" podID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerID="ea10af9989f367cd48d7ef0a3dcd9b223668ea66498a7029b126d22456a6913b" exitCode=0
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.000957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerDied","Data":"ea10af9989f367cd48d7ef0a3dcd9b223668ea66498a7029b126d22456a6913b"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.003452 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerStarted","Data":"a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.003633 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api-log" containerID="cri-o://15f9d17c8d3ff78dd938d1ea15bc17058234ca145574852e5801d6552db2b1ca" gracePeriod=30
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.003738 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.004147 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api" containerID="cri-o://a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a" gracePeriod=30
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.018917 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" event={"ID":"ad70250d-1392-4c1f-b661-09709cb9d7b0","Type":"ContainerStarted","Data":"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.020814 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.045938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerStarted","Data":"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.064404 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerStarted","Data":"76e07b4d01b40ce6fe2153747a2637a847f6695c48e7781c037a25626d453609"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.064455 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerStarted","Data":"867f5731f60fe6f590fb1dc5ac8c8b8bca5f8c58708b87e5c7dc1d2c9b28d1d1"}
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.064889 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.064860797 podStartE2EDuration="6.064860797s" podCreationTimestamp="2025-12-09 11:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:31.021275283 +0000 UTC m=+1357.846476817" watchObservedRunningTime="2025-12-09 11:54:31.064860797 +0000 UTC m=+1357.890062321"
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.072309 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" podStartSLOduration=6.072282307 podStartE2EDuration="6.072282307s" podCreationTimestamp="2025-12-09 11:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:31.046555044 +0000 UTC m=+1357.871756568" watchObservedRunningTime="2025-12-09 11:54:31.072282307 +0000 UTC m=+1357.897483831"
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.727645 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.913157 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.914089 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.914273 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ln76\" (UniqueName: \"kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.915062 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.915417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.915492 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.915978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.916042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd\") pod \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\" (UID: \"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f\") "
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.917184 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.920456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76" (OuterVolumeSpecName: "kube-api-access-8ln76") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "kube-api-access-8ln76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.920482 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.920549 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.924633 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts" (OuterVolumeSpecName: "scripts") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:31 crc kubenswrapper[4745]: I1209 11:54:31.950135 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.020705 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.022133 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ln76\" (UniqueName: \"kubernetes.io/projected/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-kube-api-access-8ln76\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.022155 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.022167 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.022175 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.028130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data" (OuterVolumeSpecName: "config-data") pod "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" (UID: "acbf1f60-7a6b-473b-8f36-00bc8bfffa3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.092437 4745 generic.go:334] "Generic (PLEG): container finished" podID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerID="15f9d17c8d3ff78dd938d1ea15bc17058234ca145574852e5801d6552db2b1ca" exitCode=143
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.092539 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerDied","Data":"15f9d17c8d3ff78dd938d1ea15bc17058234ca145574852e5801d6552db2b1ca"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.094383 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerStarted","Data":"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.100779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerStarted","Data":"9b2f931241d0299ec3b05ca17d1893191def91b2abcc6ab8e43b6350e1923813"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.100971 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.102719 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerStarted","Data":"84c6b918c9652ae6342e8c33b38c40a7337bb5caf0afd25ca43e70b2c29d044c"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.106039 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acbf1f60-7a6b-473b-8f36-00bc8bfffa3f","Type":"ContainerDied","Data":"c08a53bfb57599cc3decea29fe159b55c3b631086570e4eb8258ef5955ebfbea"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.106080 4745 scope.go:117] "RemoveContainer" containerID="fd7788b125194e4e4a263fa965e205af8f8904f04815be5bca118a21310532f7"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.106193 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.118330 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.097863867 podStartE2EDuration="8.118306363s" podCreationTimestamp="2025-12-09 11:54:24 +0000 UTC" firstStartedPulling="2025-12-09 11:54:25.880971543 +0000 UTC m=+1352.706173067" lastFinishedPulling="2025-12-09 11:54:26.901414039 +0000 UTC m=+1353.726615563" observedRunningTime="2025-12-09 11:54:32.111479329 +0000 UTC m=+1358.936680863" watchObservedRunningTime="2025-12-09 11:54:32.118306363 +0000 UTC m=+1358.943507887"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.120451 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerStarted","Data":"09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.120502 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerStarted","Data":"247fcd5449ca5281d6d3e9c9cde028019450ce328caca6d4d39e7b414171ced2"}
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.131707 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.149229 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" podStartSLOduration=6.098474749 podStartE2EDuration="9.149200745s" podCreationTimestamp="2025-12-09 11:54:23 +0000 UTC" firstStartedPulling="2025-12-09 11:54:25.175179598 +0000 UTC m=+1352.000381122" lastFinishedPulling="2025-12-09 11:54:28.225905594 +0000 UTC m=+1355.051107118" observedRunningTime="2025-12-09 11:54:32.135909167 +0000 UTC m=+1358.961110701" watchObservedRunningTime="2025-12-09 11:54:32.149200745 +0000 UTC m=+1358.974402279"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.194359 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cf679bcc-g65zr" podStartSLOduration=3.5300246189999998 podStartE2EDuration="9.19433333s" podCreationTimestamp="2025-12-09 11:54:23 +0000 UTC" firstStartedPulling="2025-12-09 11:54:25.156544606 +0000 UTC m=+1351.981746130" lastFinishedPulling="2025-12-09 11:54:30.820853307 +0000 UTC m=+1357.646054841" observedRunningTime="2025-12-09 11:54:32.191959746 +0000 UTC m=+1359.017161280" watchObservedRunningTime="2025-12-09 11:54:32.19433333 +0000 UTC m=+1359.019534854"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.196677 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57876487f8-zgj8m" podStartSLOduration=3.196671293 podStartE2EDuration="3.196671293s" podCreationTimestamp="2025-12-09 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:32.167333323 +0000 UTC m=+1358.992534877" watchObservedRunningTime="2025-12-09 11:54:32.196671293 +0000 UTC m=+1359.021872807"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.222739 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.246064 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.258184 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:54:32 crc kubenswrapper[4745]: E1209 11:54:32.258778 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-central-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.258798 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-central-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: E1209 11:54:32.258814 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="sg-core"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.258822 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="sg-core"
Dec 09 11:54:32 crc kubenswrapper[4745]: E1209 11:54:32.258851 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="proxy-httpd"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.258858 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="proxy-httpd"
Dec 09 11:54:32 crc kubenswrapper[4745]: E1209 11:54:32.258881 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-notification-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.258887 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-notification-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.259084 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="proxy-httpd"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.259117 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-notification-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.259130 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="sg-core"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.259143 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" containerName="ceilometer-central-agent"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.261075 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.263903 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.264486 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.271666 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.437241 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.438139 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.438179 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlh5\" (UniqueName: \"kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.438234 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.438835 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.439005 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.439127 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.507989 4745 scope.go:117] "RemoveContainer" containerID="8348c879ee07ca3f7d84fb1ffb00d578ef657f44ba95d488301a8bbc08a8dd73"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.540816 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.540960 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.540981 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.541008 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlh5\" (UniqueName: \"kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.541054 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.541076 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.541101 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0"
Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.550832 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.550899 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.555928 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.559198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.559572 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.563217 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.573649 4745 
scope.go:117] "RemoveContainer" containerID="ea10af9989f367cd48d7ef0a3dcd9b223668ea66498a7029b126d22456a6913b" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.575183 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlh5\" (UniqueName: \"kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5\") pod \"ceilometer-0\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.580991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:54:32 crc kubenswrapper[4745]: I1209 11:54:32.726540 4745 scope.go:117] "RemoveContainer" containerID="98404fc4ea5196498e38574d8cff7c571312490bfef5bcd9d09ba1fd7edd741a" Dec 09 11:54:33 crc kubenswrapper[4745]: I1209 11:54:33.133643 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:33 crc kubenswrapper[4745]: W1209 11:54:33.144720 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ca535f_9ded_4cd8_bdd7_024d124dc87a.slice/crio-202e5cdf65cfc511eb57dba2b2a8d5b4c85ee50e8ee2b614dade06dccaf1ed64 WatchSource:0}: Error finding container 202e5cdf65cfc511eb57dba2b2a8d5b4c85ee50e8ee2b614dade06dccaf1ed64: Status 404 returned error can't find the container with id 202e5cdf65cfc511eb57dba2b2a8d5b4c85ee50e8ee2b614dade06dccaf1ed64 Dec 09 11:54:33 crc kubenswrapper[4745]: I1209 11:54:33.147320 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57876487f8-zgj8m" Dec 09 11:54:33 crc kubenswrapper[4745]: I1209 11:54:33.566972 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbf1f60-7a6b-473b-8f36-00bc8bfffa3f" path="/var/lib/kubelet/pods/acbf1f60-7a6b-473b-8f36-00bc8bfffa3f/volumes" Dec 09 11:54:34 crc kubenswrapper[4745]: I1209 
11:54:34.159872 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerStarted","Data":"c8be65f5b1f5caae49f2e5b50c120cdc59cc18d82458533bb9e001ff091126c4"} Dec 09 11:54:34 crc kubenswrapper[4745]: I1209 11:54:34.159940 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerStarted","Data":"202e5cdf65cfc511eb57dba2b2a8d5b4c85ee50e8ee2b614dade06dccaf1ed64"} Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.176245 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerStarted","Data":"4db7c6576d85acf8c4612b1c4a6a42c810d3ce10bc0382ac5b890b6fb7e8eb45"} Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.309725 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.591394 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.688789 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.721031 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b5f4bd8c8-nsssp" Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.761434 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.761777 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="dnsmasq-dns" 
containerID="cri-o://c99f3570c494823f8b0bfc38c69a7643421d26c483cf943e0cedfc105b0577d1" gracePeriod=10 Dec 09 11:54:35 crc kubenswrapper[4745]: I1209 11:54:35.815443 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.118302 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6765469b68-lmphn" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.213951 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerStarted","Data":"0b837e94af99bb7ed0cf1b3cc865496d2bc79d7b765b81e6f10d82d5d87a798e"} Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.230622 4745 generic.go:334] "Generic (PLEG): container finished" podID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerID="c99f3570c494823f8b0bfc38c69a7643421d26c483cf943e0cedfc105b0577d1" exitCode=0 Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.231606 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" event={"ID":"d02ef196-0d33-4546-8670-bd7dbb54e9b1","Type":"ContainerDied","Data":"c99f3570c494823f8b0bfc38c69a7643421d26c483cf943e0cedfc105b0577d1"} Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.329241 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.485919 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.640688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.640799 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.640856 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.640947 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb2d8\" (UniqueName: \"kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.641043 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.641166 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc\") pod \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\" (UID: \"d02ef196-0d33-4546-8670-bd7dbb54e9b1\") " Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.681451 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8" (OuterVolumeSpecName: "kube-api-access-kb2d8") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). InnerVolumeSpecName "kube-api-access-kb2d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.742444 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config" (OuterVolumeSpecName: "config") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.759647 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.759703 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb2d8\" (UniqueName: \"kubernetes.io/projected/d02ef196-0d33-4546-8670-bd7dbb54e9b1-kube-api-access-kb2d8\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.814207 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.832454 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.861312 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.861352 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.873125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.874545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d02ef196-0d33-4546-8670-bd7dbb54e9b1" (UID: "d02ef196-0d33-4546-8670-bd7dbb54e9b1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.963487 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:36 crc kubenswrapper[4745]: I1209 11:54:36.963783 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d02ef196-0d33-4546-8670-bd7dbb54e9b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.246340 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerStarted","Data":"01643454eadbd303ceec170681409d19446b0d2e929575349a5418d564e8a107"} Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.246445 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.249931 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.249961 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-9jxz8" event={"ID":"d02ef196-0d33-4546-8670-bd7dbb54e9b1","Type":"ContainerDied","Data":"46eee2b3fbab85957218daf1c2aabe4bfefe38aaae84b0c5c39ac483db6b955f"} Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.250083 4745 scope.go:117] "RemoveContainer" containerID="c99f3570c494823f8b0bfc38c69a7643421d26c483cf943e0cedfc105b0577d1" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.250085 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="cinder-scheduler" containerID="cri-o://18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90" gracePeriod=30 Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.250186 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="probe" containerID="cri-o://987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd" gracePeriod=30 Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.289345 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.830960519 podStartE2EDuration="5.28932299s" podCreationTimestamp="2025-12-09 11:54:32 +0000 UTC" firstStartedPulling="2025-12-09 11:54:33.147089695 +0000 UTC m=+1359.972291219" lastFinishedPulling="2025-12-09 11:54:36.605452166 +0000 UTC m=+1363.430653690" observedRunningTime="2025-12-09 11:54:37.277920663 +0000 UTC m=+1364.103122187" watchObservedRunningTime="2025-12-09 11:54:37.28932299 +0000 UTC m=+1364.114524514" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.297007 4745 scope.go:117] "RemoveContainer" 
containerID="9479b00344fe8ffbe4e7f32232a1f6cd2146c5dbcb30a375a14d294380268267" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.309071 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.328881 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-9jxz8"] Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.565305 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" path="/var/lib/kubelet/pods/d02ef196-0d33-4546-8670-bd7dbb54e9b1/volumes" Dec 09 11:54:37 crc kubenswrapper[4745]: I1209 11:54:37.684126 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57876487f8-zgj8m" Dec 09 11:54:38 crc kubenswrapper[4745]: I1209 11:54:38.262456 4745 generic.go:334] "Generic (PLEG): container finished" podID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerID="987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd" exitCode=0 Dec 09 11:54:38 crc kubenswrapper[4745]: I1209 11:54:38.262535 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerDied","Data":"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd"} Dec 09 11:54:38 crc kubenswrapper[4745]: I1209 11:54:38.373974 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:38 crc kubenswrapper[4745]: I1209 11:54:38.538203 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:54:38 crc kubenswrapper[4745]: I1209 11:54:38.711967 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.150809 4745 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.275952 4745 generic.go:334] "Generic (PLEG): container finished" podID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerID="18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90" exitCode=0 Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.276030 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerDied","Data":"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90"} Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.276071 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.276084 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7aed9859-5b8c-4a55-9b4d-b9264e59d114","Type":"ContainerDied","Data":"50166daa39c355162f82402f1eacc532f844a4f332c0bbe9dcc9a2fb47099d3c"} Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.276101 4745 scope.go:117] "RemoveContainer" containerID="987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.313283 4745 scope.go:117] "RemoveContainer" containerID="18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.321158 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id\") pod \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.321321 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle\") pod \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.321393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.322360 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfcjb\" (UniqueName: \"kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb\") pod \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.322433 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts\") pod \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.322496 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data\") pod \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.322582 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom\") pod 
\"7aed9859-5b8c-4a55-9b4d-b9264e59d114\" (UID: \"7aed9859-5b8c-4a55-9b4d-b9264e59d114\") " Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.323392 4745 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aed9859-5b8c-4a55-9b4d-b9264e59d114-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.332103 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb" (OuterVolumeSpecName: "kube-api-access-hfcjb") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "kube-api-access-hfcjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.335213 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.335625 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts" (OuterVolumeSpecName: "scripts") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.371094 4745 scope.go:117] "RemoveContainer" containerID="987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd"
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.373103 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd\": container with ID starting with 987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd not found: ID does not exist" containerID="987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.373212 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd"} err="failed to get container status \"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd\": rpc error: code = NotFound desc = could not find container \"987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd\": container with ID starting with 987e1d3356c7a0f17ed41dec794387ba432ec393d28857626912ba498a0d5ddd not found: ID does not exist"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.373298 4745 scope.go:117] "RemoveContainer" containerID="18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90"
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.374867 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90\": container with ID starting with 18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90 not found: ID does not exist" containerID="18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.375064 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90"} err="failed to get container status \"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90\": rpc error: code = NotFound desc = could not find container \"18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90\": container with ID starting with 18a6bb2c5cb08196dd91ec526a672b1df1e62a31ed2a06a8b4ebd0936a476b90 not found: ID does not exist"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.394675 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.426562 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.426624 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfcjb\" (UniqueName: \"kubernetes.io/projected/7aed9859-5b8c-4a55-9b4d-b9264e59d114-kube-api-access-hfcjb\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.426636 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.426644 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.450677 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b68fbfd5-bx5sx"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.458614 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data" (OuterVolumeSpecName: "config-data") pod "7aed9859-5b8c-4a55-9b4d-b9264e59d114" (UID: "7aed9859-5b8c-4a55-9b4d-b9264e59d114"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.529360 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aed9859-5b8c-4a55-9b4d-b9264e59d114-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.535480 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"]
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.535794 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b5f4bd8c8-nsssp" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-api" containerID="cri-o://3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4" gracePeriod=30
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.535920 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b5f4bd8c8-nsssp" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-httpd" containerID="cri-o://0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4" gracePeriod=30
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.622706 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.630342 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.681903 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.682558 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="dnsmasq-dns"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682579 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="dnsmasq-dns"
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.682610 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="init"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682618 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="init"
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.682655 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="cinder-scheduler"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682676 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="cinder-scheduler"
Dec 09 11:54:39 crc kubenswrapper[4745]: E1209 11:54:39.682687 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="probe"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682693 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="probe"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682934 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="cinder-scheduler"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682958 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" containerName="probe"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.682969 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02ef196-0d33-4546-8670-bd7dbb54e9b1" containerName="dnsmasq-dns"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.684301 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.692612 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.698046 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.835673 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.835983 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.836036 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.836059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.836082 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kvzz\" (UniqueName: \"kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.836148 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.937618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.938578 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.938636 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kvzz\" (UniqueName: \"kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.938812 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.938910 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.938987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.939194 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.942503 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.943139 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.944150 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.948022 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:39 crc kubenswrapper[4745]: I1209 11:54:39.964096 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kvzz\" (UniqueName: \"kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz\") pod \"cinder-scheduler-0\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " pod="openstack/cinder-scheduler-0"
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.093936 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.194771 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57876487f8-zgj8m"
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.288916 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"]
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.293868 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6765469b68-lmphn" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api-log" containerID="cri-o://9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905" gracePeriod=30
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.294492 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6765469b68-lmphn" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" containerID="cri-o://69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64" gracePeriod=30
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.311600 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6765469b68-lmphn" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": EOF"
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.333938 4745 generic.go:334] "Generic (PLEG): container finished" podID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerID="0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4" exitCode=0
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.334007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerDied","Data":"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"}
Dec 09 11:54:40 crc kubenswrapper[4745]: I1209 11:54:40.805231 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 09 11:54:40 crc kubenswrapper[4745]: W1209 11:54:40.809536 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaef3c48_5e7c_4ea3_a2d0_da44ea528455.slice/crio-b2075f351e7ae8c27f3ebc434998594b8366224de945e6605e33ad1d8cc3d96d WatchSource:0}: Error finding container b2075f351e7ae8c27f3ebc434998594b8366224de945e6605e33ad1d8cc3d96d: Status 404 returned error can't find the container with id b2075f351e7ae8c27f3ebc434998594b8366224de945e6605e33ad1d8cc3d96d
Dec 09 11:54:41 crc kubenswrapper[4745]: I1209 11:54:41.353673 4745 generic.go:334] "Generic (PLEG): container finished" podID="bd180b10-c922-4741-bf65-0a7c220e980d" containerID="9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905" exitCode=143
Dec 09 11:54:41 crc kubenswrapper[4745]: I1209 11:54:41.354069 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerDied","Data":"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905"}
Dec 09 11:54:41 crc kubenswrapper[4745]: I1209 11:54:41.355873 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerStarted","Data":"b2075f351e7ae8c27f3ebc434998594b8366224de945e6605e33ad1d8cc3d96d"}
Dec 09 11:54:41 crc kubenswrapper[4745]: I1209 11:54:41.578795 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aed9859-5b8c-4a55-9b4d-b9264e59d114" path="/var/lib/kubelet/pods/7aed9859-5b8c-4a55-9b4d-b9264e59d114/volumes"
Dec 09 11:54:42 crc kubenswrapper[4745]: I1209 11:54:42.422857 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerStarted","Data":"85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025"}
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.249681 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b5f4bd8c8-nsssp"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.363295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config\") pod \"a956face-d831-4e78-b6f5-36fd5dd17d87\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") "
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.363365 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdjm\" (UniqueName: \"kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm\") pod \"a956face-d831-4e78-b6f5-36fd5dd17d87\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") "
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.363462 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs\") pod \"a956face-d831-4e78-b6f5-36fd5dd17d87\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") "
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.363593 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle\") pod \"a956face-d831-4e78-b6f5-36fd5dd17d87\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") "
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.364636 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config\") pod \"a956face-d831-4e78-b6f5-36fd5dd17d87\" (UID: \"a956face-d831-4e78-b6f5-36fd5dd17d87\") "
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.370708 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm" (OuterVolumeSpecName: "kube-api-access-xwdjm") pod "a956face-d831-4e78-b6f5-36fd5dd17d87" (UID: "a956face-d831-4e78-b6f5-36fd5dd17d87"). InnerVolumeSpecName "kube-api-access-xwdjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.389272 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a956face-d831-4e78-b6f5-36fd5dd17d87" (UID: "a956face-d831-4e78-b6f5-36fd5dd17d87"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.415166 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a956face-d831-4e78-b6f5-36fd5dd17d87" (UID: "a956face-d831-4e78-b6f5-36fd5dd17d87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.433356 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config" (OuterVolumeSpecName: "config") pod "a956face-d831-4e78-b6f5-36fd5dd17d87" (UID: "a956face-d831-4e78-b6f5-36fd5dd17d87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.440592 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerStarted","Data":"7a0d9d93ae67ba5d6ed408c67d268af4c670890fd02fafc91373ab63fe2182ca"}
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.447129 4745 generic.go:334] "Generic (PLEG): container finished" podID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerID="3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4" exitCode=0
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.447204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerDied","Data":"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"}
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.447248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b5f4bd8c8-nsssp" event={"ID":"a956face-d831-4e78-b6f5-36fd5dd17d87","Type":"ContainerDied","Data":"ca7f8020f033eb2997746c35bc767adbe0fa8151e1988ff93196c22ca7d1db8d"}
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.447277 4745 scope.go:117] "RemoveContainer" containerID="0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.447468 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b5f4bd8c8-nsssp"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.470173 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.470207 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.470217 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdjm\" (UniqueName: \"kubernetes.io/projected/a956face-d831-4e78-b6f5-36fd5dd17d87-kube-api-access-xwdjm\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.470229 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.480718 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a956face-d831-4e78-b6f5-36fd5dd17d87" (UID: "a956face-d831-4e78-b6f5-36fd5dd17d87"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.506223 4745 scope.go:117] "RemoveContainer" containerID="3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.529583 4745 scope.go:117] "RemoveContainer" containerID="0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"
Dec 09 11:54:43 crc kubenswrapper[4745]: E1209 11:54:43.530554 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4\": container with ID starting with 0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4 not found: ID does not exist" containerID="0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.530587 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4"} err="failed to get container status \"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4\": rpc error: code = NotFound desc = could not find container \"0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4\": container with ID starting with 0f03b2380ccee137ab05d37ab9697bd0cbf1866ed1b4e0b065bc8089e1995aa4 not found: ID does not exist"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.530623 4745 scope.go:117] "RemoveContainer" containerID="3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"
Dec 09 11:54:43 crc kubenswrapper[4745]: E1209 11:54:43.530859 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4\": container with ID starting with 3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4 not found: ID does not exist" containerID="3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.530902 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4"} err="failed to get container status \"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4\": rpc error: code = NotFound desc = could not find container \"3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4\": container with ID starting with 3cdceb99a9eec1767150325fcaa34295a923cc03b478f8870daf679e47f911c4 not found: ID does not exist"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.572980 4745 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a956face-d831-4e78-b6f5-36fd5dd17d87-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.782127 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.782105779 podStartE2EDuration="4.782105779s" podCreationTimestamp="2025-12-09 11:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:43.470370705 +0000 UTC m=+1370.295572249" watchObservedRunningTime="2025-12-09 11:54:43.782105779 +0000 UTC m=+1370.607307303"
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.786188 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"]
Dec 09 11:54:43 crc kubenswrapper[4745]: I1209 11:54:43.798076 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b5f4bd8c8-nsssp"]
Dec 09 11:54:44 crc kubenswrapper[4745]: I1209 11:54:44.740878 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56c9f47db8-wmdgz"
Dec 09 11:54:44 crc kubenswrapper[4745]: I1209 11:54:44.766724 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6765469b68-lmphn" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:48528->10.217.0.158:9311: read: connection reset by peer"
Dec 09 11:54:44 crc kubenswrapper[4745]: I1209 11:54:44.767034 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6765469b68-lmphn" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:48520->10.217.0.158:9311: read: connection reset by peer"
Dec 09 11:54:44 crc kubenswrapper[4745]: E1209 11:54:44.868563 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd180b10_c922_4741_bf65_0a7c220e980d.slice/crio-conmon-69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd180b10_c922_4741_bf65_0a7c220e980d.slice/crio-69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64.scope\": RecentStats: unable to find data in memory cache]"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.099390 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.243609 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6765469b68-lmphn"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.413262 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle\") pod \"bd180b10-c922-4741-bf65-0a7c220e980d\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") "
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.413346 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68kpc\" (UniqueName: \"kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc\") pod \"bd180b10-c922-4741-bf65-0a7c220e980d\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") "
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.413437 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data\") pod \"bd180b10-c922-4741-bf65-0a7c220e980d\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") "
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.413527 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom\") pod \"bd180b10-c922-4741-bf65-0a7c220e980d\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") "
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.413594 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs\") pod \"bd180b10-c922-4741-bf65-0a7c220e980d\" (UID: \"bd180b10-c922-4741-bf65-0a7c220e980d\") "
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.414681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs" (OuterVolumeSpecName: "logs") pod "bd180b10-c922-4741-bf65-0a7c220e980d" (UID: "bd180b10-c922-4741-bf65-0a7c220e980d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.421455 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc" (OuterVolumeSpecName: "kube-api-access-68kpc") pod "bd180b10-c922-4741-bf65-0a7c220e980d" (UID: "bd180b10-c922-4741-bf65-0a7c220e980d"). InnerVolumeSpecName "kube-api-access-68kpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.429685 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd180b10-c922-4741-bf65-0a7c220e980d" (UID: "bd180b10-c922-4741-bf65-0a7c220e980d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.447658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd180b10-c922-4741-bf65-0a7c220e980d" (UID: "bd180b10-c922-4741-bf65-0a7c220e980d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.472433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data" (OuterVolumeSpecName: "config-data") pod "bd180b10-c922-4741-bf65-0a7c220e980d" (UID: "bd180b10-c922-4741-bf65-0a7c220e980d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.496947 4745 generic.go:334] "Generic (PLEG): container finished" podID="bd180b10-c922-4741-bf65-0a7c220e980d" containerID="69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64" exitCode=0
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.497042 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6765469b68-lmphn"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.497042 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerDied","Data":"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64"}
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.497161 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6765469b68-lmphn" event={"ID":"bd180b10-c922-4741-bf65-0a7c220e980d","Type":"ContainerDied","Data":"851c659be8b6d5ce723578bba8d035d480792b31ca537a70d2092073886089da"}
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.497191 4745 scope.go:117] "RemoveContainer" containerID="69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.519851 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.519886 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68kpc\" (UniqueName: \"kubernetes.io/projected/bd180b10-c922-4741-bf65-0a7c220e980d-kube-api-access-68kpc\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.519898 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.519911 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd180b10-c922-4741-bf65-0a7c220e980d-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.519925 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd180b10-c922-4741-bf65-0a7c220e980d-logs\") on node \"crc\" DevicePath \"\""
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.545355 4745 scope.go:117] "RemoveContainer" containerID="9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.548290 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"]
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.569543 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" path="/var/lib/kubelet/pods/a956face-d831-4e78-b6f5-36fd5dd17d87/volumes"
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.570194 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6765469b68-lmphn"]
Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.574528 4745 scope.go:117] "RemoveContainer" containerID="69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64"
Dec 09 11:54:45 crc kubenswrapper[4745]: E1209 11:54:45.575714 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64\": container with ID starting with 69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64 not found: ID does not
exist" containerID="69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64" Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.575760 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64"} err="failed to get container status \"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64\": rpc error: code = NotFound desc = could not find container \"69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64\": container with ID starting with 69d7836fd32944214e2746269b1e246a91a93c3633eaa6a5f97b70b649aebf64 not found: ID does not exist" Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.575791 4745 scope.go:117] "RemoveContainer" containerID="9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905" Dec 09 11:54:45 crc kubenswrapper[4745]: E1209 11:54:45.576317 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905\": container with ID starting with 9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905 not found: ID does not exist" containerID="9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905" Dec 09 11:54:45 crc kubenswrapper[4745]: I1209 11:54:45.576415 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905"} err="failed to get container status \"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905\": rpc error: code = NotFound desc = could not find container \"9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905\": container with ID starting with 9907a5e0df1ef3b89d42441ae0bbd28fb681911524522f3e45d9dfedc04a6905 not found: ID does not exist" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.243370 4745 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 11:54:46 crc kubenswrapper[4745]: E1209 11:54:46.243920 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-api" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.243937 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-api" Dec 09 11:54:46 crc kubenswrapper[4745]: E1209 11:54:46.243952 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api-log" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.243963 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api-log" Dec 09 11:54:46 crc kubenswrapper[4745]: E1209 11:54:46.243996 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-httpd" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244005 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-httpd" Dec 09 11:54:46 crc kubenswrapper[4745]: E1209 11:54:46.244018 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244028 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244464 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api-log" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244479 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd180b10-c922-4741-bf65-0a7c220e980d" containerName="barbican-api" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244527 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-httpd" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.244543 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a956face-d831-4e78-b6f5-36fd5dd17d87" containerName="neutron-api" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.245352 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.258468 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.259090 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.259213 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.260591 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9qv6l" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.333987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm46b\" (UniqueName: \"kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.334159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.334197 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.334246 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.436592 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.437444 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.438923 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config\") pod \"openstackclient\" 
(UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.439264 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm46b\" (UniqueName: \"kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.441551 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.443501 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.444917 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.462810 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm46b\" (UniqueName: \"kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b\") pod \"openstackclient\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " pod="openstack/openstackclient" Dec 09 11:54:46 crc kubenswrapper[4745]: I1209 11:54:46.565805 4745 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:54:47 crc kubenswrapper[4745]: I1209 11:54:47.046108 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 11:54:47 crc kubenswrapper[4745]: W1209 11:54:47.050209 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac238b52_4167_4847_b66f_6985b784268c.slice/crio-c7ea4f241fb99a21460ae0371bdf10177d9dbf536f3ab64edd5cbc4a3179e824 WatchSource:0}: Error finding container c7ea4f241fb99a21460ae0371bdf10177d9dbf536f3ab64edd5cbc4a3179e824: Status 404 returned error can't find the container with id c7ea4f241fb99a21460ae0371bdf10177d9dbf536f3ab64edd5cbc4a3179e824 Dec 09 11:54:47 crc kubenswrapper[4745]: I1209 11:54:47.520472 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ac238b52-4167-4847-b66f-6985b784268c","Type":"ContainerStarted","Data":"c7ea4f241fb99a21460ae0371bdf10177d9dbf536f3ab64edd5cbc4a3179e824"} Dec 09 11:54:47 crc kubenswrapper[4745]: I1209 11:54:47.566731 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd180b10-c922-4741-bf65-0a7c220e980d" path="/var/lib/kubelet/pods/bd180b10-c922-4741-bf65-0a7c220e980d/volumes" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.115611 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.118560 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.121807 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.122067 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.122267 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.139681 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215250 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215396 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 
11:54:50.215419 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf8gr\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215440 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215501 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215792 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.215970 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 
11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.317865 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.317953 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf8gr\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.317989 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318091 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318112 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318145 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318164 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.318706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.319297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.325225 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.325789 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.329295 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.329772 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.339356 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.342162 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf8gr\" (UniqueName: 
\"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr\") pod \"swift-proxy-5c99cd79f9-qgnsc\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.468777 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:50 crc kubenswrapper[4745]: I1209 11:54:50.496191 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.096973 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.936775 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.940059 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-central-agent" containerID="cri-o://c8be65f5b1f5caae49f2e5b50c120cdc59cc18d82458533bb9e001ff091126c4" gracePeriod=30 Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.940113 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-notification-agent" containerID="cri-o://4db7c6576d85acf8c4612b1c4a6a42c810d3ce10bc0382ac5b890b6fb7e8eb45" gracePeriod=30 Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.940082 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="proxy-httpd" containerID="cri-o://01643454eadbd303ceec170681409d19446b0d2e929575349a5418d564e8a107" gracePeriod=30 Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 
11:54:51.940169 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="sg-core" containerID="cri-o://0b837e94af99bb7ed0cf1b3cc865496d2bc79d7b765b81e6f10d82d5d87a798e" gracePeriod=30 Dec 09 11:54:51 crc kubenswrapper[4745]: I1209 11:54:51.946469 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.594801 4745 generic.go:334] "Generic (PLEG): container finished" podID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerID="01643454eadbd303ceec170681409d19446b0d2e929575349a5418d564e8a107" exitCode=0 Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.595388 4745 generic.go:334] "Generic (PLEG): container finished" podID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerID="0b837e94af99bb7ed0cf1b3cc865496d2bc79d7b765b81e6f10d82d5d87a798e" exitCode=2 Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.595402 4745 generic.go:334] "Generic (PLEG): container finished" podID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerID="c8be65f5b1f5caae49f2e5b50c120cdc59cc18d82458533bb9e001ff091126c4" exitCode=0 Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.595069 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerDied","Data":"01643454eadbd303ceec170681409d19446b0d2e929575349a5418d564e8a107"} Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.595454 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerDied","Data":"0b837e94af99bb7ed0cf1b3cc865496d2bc79d7b765b81e6f10d82d5d87a798e"} Dec 09 11:54:52 crc kubenswrapper[4745]: I1209 11:54:52.595471 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerDied","Data":"c8be65f5b1f5caae49f2e5b50c120cdc59cc18d82458533bb9e001ff091126c4"} Dec 09 11:54:54 crc kubenswrapper[4745]: I1209 11:54:54.629551 4745 generic.go:334] "Generic (PLEG): container finished" podID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerID="4db7c6576d85acf8c4612b1c4a6a42c810d3ce10bc0382ac5b890b6fb7e8eb45" exitCode=0 Dec 09 11:54:54 crc kubenswrapper[4745]: I1209 11:54:54.629662 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerDied","Data":"4db7c6576d85acf8c4612b1c4a6a42c810d3ce10bc0382ac5b890b6fb7e8eb45"} Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.667172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerStarted","Data":"4ba713996b77f2d4999979b9b73f7452cf113d5729d70af3dbe6b23cede34a3b"} Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.764035 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.817199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.817294 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.820038 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.820263 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlh5\" (UniqueName: \"kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.820680 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.821060 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.821221 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.821254 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml\") pod \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\" (UID: \"04ca535f-9ded-4cd8-bdd7-024d124dc87a\") " Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.821488 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.822094 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.822131 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04ca535f-9ded-4cd8-bdd7-024d124dc87a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.826024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5" (OuterVolumeSpecName: "kube-api-access-gvlh5") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "kube-api-access-gvlh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.826288 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts" (OuterVolumeSpecName: "scripts") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.856269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.924426 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.924464 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlh5\" (UniqueName: \"kubernetes.io/projected/04ca535f-9ded-4cd8-bdd7-024d124dc87a-kube-api-access-gvlh5\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.924475 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.928095 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:57 crc kubenswrapper[4745]: I1209 11:54:57.939449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data" (OuterVolumeSpecName: "config-data") pod "04ca535f-9ded-4cd8-bdd7-024d124dc87a" (UID: "04ca535f-9ded-4cd8-bdd7-024d124dc87a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.026878 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.026921 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ca535f-9ded-4cd8-bdd7-024d124dc87a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.691607 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"04ca535f-9ded-4cd8-bdd7-024d124dc87a","Type":"ContainerDied","Data":"202e5cdf65cfc511eb57dba2b2a8d5b4c85ee50e8ee2b614dade06dccaf1ed64"} Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.691958 4745 scope.go:117] "RemoveContainer" containerID="01643454eadbd303ceec170681409d19446b0d2e929575349a5418d564e8a107" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.691618 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.704281 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerStarted","Data":"5b3b68afb88cc6f2e0bb3746cc7c964f36672819d43cd68d83781dceeba6aa46"} Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.704385 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerStarted","Data":"324e724346a04d819890081d5a91493f6e1feb208d14a31f346c568761a5cbf4"} Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.704452 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.704482 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.713918 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ac238b52-4167-4847-b66f-6985b784268c","Type":"ContainerStarted","Data":"dc6f10acb25d86858fe8f0ee4fbcf99c84e675bea1bcdd1d53f89781a1482c42"} Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.716874 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mndrs"] Dec 09 11:54:58 crc kubenswrapper[4745]: E1209 11:54:58.718781 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-central-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.718812 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-central-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: E1209 11:54:58.718844 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="proxy-httpd" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.718852 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="proxy-httpd" Dec 09 11:54:58 crc kubenswrapper[4745]: E1209 11:54:58.718867 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-notification-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.718874 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-notification-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: E1209 11:54:58.718899 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="sg-core" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.718909 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="sg-core" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.719195 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="sg-core" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.719238 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-notification-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.719248 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="ceilometer-central-agent" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.719264 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" containerName="proxy-httpd" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.720174 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.742785 4745 scope.go:117] "RemoveContainer" containerID="0b837e94af99bb7ed0cf1b3cc865496d2bc79d7b765b81e6f10d82d5d87a798e" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.745890 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mndrs"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.746772 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" podStartSLOduration=8.746752126 podStartE2EDuration="8.746752126s" podCreationTimestamp="2025-12-09 11:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:54:58.735497033 +0000 UTC m=+1385.560698557" watchObservedRunningTime="2025-12-09 11:54:58.746752126 +0000 UTC m=+1385.571953650" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.811096 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.820619 4745 scope.go:117] "RemoveContainer" containerID="4db7c6576d85acf8c4612b1c4a6a42c810d3ce10bc0382ac5b890b6fb7e8eb45" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.831869 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.846428 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9k4\" (UniqueName: \"kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.846525 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.856955 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qq6pt"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.858901 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.863482 4745 scope.go:117] "RemoveContainer" containerID="c8be65f5b1f5caae49f2e5b50c120cdc59cc18d82458533bb9e001ff091126c4" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.888123 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-03c2-account-create-update-4546z"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.889423 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.900771 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.914222 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.916731 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.923105 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.923325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.938823 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qq6pt"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948233 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948299 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9k4\" (UniqueName: \"kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948333 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948357 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948425 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fm6\" (UniqueName: \"kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.948471 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbpnl\" (UniqueName: \"kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.954751 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.961211 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.971496 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.461443097 podStartE2EDuration="12.971473637s" podCreationTimestamp="2025-12-09 11:54:46 +0000 UTC" firstStartedPulling="2025-12-09 11:54:47.052599342 +0000 UTC 
m=+1373.877800866" lastFinishedPulling="2025-12-09 11:54:57.562629882 +0000 UTC m=+1384.387831406" observedRunningTime="2025-12-09 11:54:58.863937841 +0000 UTC m=+1385.689139365" watchObservedRunningTime="2025-12-09 11:54:58.971473637 +0000 UTC m=+1385.796675161" Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.989273 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-03c2-account-create-update-4546z"] Dec 09 11:54:58 crc kubenswrapper[4745]: I1209 11:54:58.995460 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9k4\" (UniqueName: \"kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4\") pod \"nova-api-db-create-mndrs\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056037 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056243 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2fm6\" (UniqueName: \"kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 
11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056332 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbpnl\" (UniqueName: \"kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056422 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c667h\" (UniqueName: \"kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056447 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056489 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056542 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056594 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.056679 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.057908 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.059248 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.067498 4745 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-db-create-mndrs" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.080599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2fm6\" (UniqueName: \"kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6\") pod \"nova-api-03c2-account-create-update-4546z\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.094111 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbpnl\" (UniqueName: \"kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl\") pod \"nova-cell0-db-create-qq6pt\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.126932 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vhzt5"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.146733 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.147569 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e536-account-create-update-bnmp6"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.149682 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.155932 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.156159 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vhzt5"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.160805 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c667h\" (UniqueName: \"kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.160883 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.160925 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.160962 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.160983 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.161014 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.161060 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.161553 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.166111 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.168889 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e536-account-create-update-bnmp6"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.174654 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts\") pod \"ceilometer-0\" 
(UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.174934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.186451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.198245 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.213332 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c667h\" (UniqueName: \"kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h\") pod \"ceilometer-0\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.213871 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.243148 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.277632 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.279883 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.280085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5xv\" (UniqueName: \"kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.280224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.280309 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd728\" (UniqueName: \"kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.341582 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d49f-account-create-update-kqsjx"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 
11:54:59.342923 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.366993 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.383143 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5xv\" (UniqueName: \"kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.383242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.383271 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd728\" (UniqueName: \"kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.383354 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc 
kubenswrapper[4745]: I1209 11:54:59.384148 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.386063 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.394603 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d49f-account-create-update-kqsjx"] Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.460247 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5xv\" (UniqueName: \"kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv\") pod \"nova-cell1-db-create-vhzt5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.482243 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd728\" (UniqueName: \"kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728\") pod \"nova-cell0-e536-account-create-update-bnmp6\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.485628 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56txb\" (UniqueName: 
\"kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.485850 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.592092 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.592204 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56txb\" (UniqueName: \"kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.594335 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:54:59 crc 
kubenswrapper[4745]: I1209 11:54:59.604482 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.651058 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.710336 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ca535f-9ded-4cd8-bdd7-024d124dc87a" path="/var/lib/kubelet/pods/04ca535f-9ded-4cd8-bdd7-024d124dc87a/volumes" Dec 09 11:54:59 crc kubenswrapper[4745]: I1209 11:54:59.737297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56txb\" (UniqueName: \"kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb\") pod \"nova-cell1-d49f-account-create-update-kqsjx\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.007599 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.340294 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mndrs"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.549134 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qq6pt"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.597966 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.607061 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-03c2-account-create-update-4546z"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.618526 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e536-account-create-update-bnmp6"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.721561 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vhzt5"] Dec 09 11:55:00 crc kubenswrapper[4745]: I1209 11:55:00.935771 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.424035 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d49f-account-create-update-kqsjx"] Dec 09 11:55:01 crc kubenswrapper[4745]: E1209 11:55:01.566563 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76672101_82d9_4d42_b793_6fa33ce5c91a.slice/crio-a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a.scope\": RecentStats: unable to find data in memory cache]" Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.772835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qq6pt" 
event={"ID":"eb863088-6993-45cc-8d6a-bd1a7d6f403a","Type":"ContainerStarted","Data":"490ddfd6330b9b3002fed97b1c7f9089c9bcf9d6419e599757b54a76e8d3e14a"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.773349 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qq6pt" event={"ID":"eb863088-6993-45cc-8d6a-bd1a7d6f403a","Type":"ContainerStarted","Data":"be659393e8f0ce48536bffe9de9c1158ef11bf7b069462ea7649724fcdb8d96d"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.780204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" event={"ID":"ea1ee906-8314-4be0-9845-f8abfe129175","Type":"ContainerStarted","Data":"ee17a952629dfbec967a9919d6ae8806d7f786cd8c251e54eac8ba1f8923275e"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.787843 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mndrs" event={"ID":"a358310a-9254-4ca9-865c-4af37de2791a","Type":"ContainerStarted","Data":"0bbc9404782fc19ece5c2c4a2c4e395165e19b288eba5d996aad423d039b8dbe"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.787908 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mndrs" event={"ID":"a358310a-9254-4ca9-865c-4af37de2791a","Type":"ContainerStarted","Data":"53afa50637f1833a4648bd0a61746e8204a5e2b4d528416690ad18c6961d7a55"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.791059 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerStarted","Data":"3a51605f3d4db0df92908e29b976943e955add46b97450c4e60eccd50cdc7c14"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.795391 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-03c2-account-create-update-4546z" 
event={"ID":"2018b5a5-625c-4595-a26f-1f4a6df2bb90","Type":"ContainerStarted","Data":"697de4a9fded904a811d04612b2ec365c4c14861fd0b1d3d145deddded6bd815"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.795467 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-03c2-account-create-update-4546z" event={"ID":"2018b5a5-625c-4595-a26f-1f4a6df2bb90","Type":"ContainerStarted","Data":"1936d8323d012c85ee90cd17a324bd51defe561fa2a166545d2a5dd2bf21ef09"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.804658 4745 generic.go:334] "Generic (PLEG): container finished" podID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerID="a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a" exitCode=137 Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.804748 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerDied","Data":"a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.807589 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-qq6pt" podStartSLOduration=3.807246225 podStartE2EDuration="3.807246225s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:01.793955207 +0000 UTC m=+1388.619156721" watchObservedRunningTime="2025-12-09 11:55:01.807246225 +0000 UTC m=+1388.632447749" Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.816619 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" event={"ID":"40d08272-d758-4404-aae4-a64897dfbab8","Type":"ContainerStarted","Data":"f59d16e21842e09e539e7cb40ecdfd9188979d63816912780cb72eb300e3fd22"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.816704 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" event={"ID":"40d08272-d758-4404-aae4-a64897dfbab8","Type":"ContainerStarted","Data":"46226b9ffb5988e8219e0525679bbc71592b889cd06eda1d8842f7dc987acbb7"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.826066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vhzt5" event={"ID":"43837717-2618-45c9-afac-172826694ae5","Type":"ContainerStarted","Data":"09ad1c05b19779977f458ccdf6da3b4f7981fb3e1c55e62d2c5d17dbd9fbfe34"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.826135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vhzt5" event={"ID":"43837717-2618-45c9-afac-172826694ae5","Type":"ContainerStarted","Data":"fab4eac75902e5206111ebbbaa5ca85e3d3c4daaf0007360160ca30a847a7fb2"} Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.844368 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" podStartSLOduration=3.844347004 podStartE2EDuration="3.844347004s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:01.841056726 +0000 UTC m=+1388.666258250" watchObservedRunningTime="2025-12-09 11:55:01.844347004 +0000 UTC m=+1388.669548528" Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.846374 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-mndrs" podStartSLOduration=3.846362929 podStartE2EDuration="3.846362929s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:01.818031386 +0000 UTC m=+1388.643232910" watchObservedRunningTime="2025-12-09 
11:55:01.846362929 +0000 UTC m=+1388.671564453" Dec 09 11:55:01 crc kubenswrapper[4745]: I1209 11:55:01.869119 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-03c2-account-create-update-4546z" podStartSLOduration=3.869098321 podStartE2EDuration="3.869098321s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:01.861244009 +0000 UTC m=+1388.686445543" watchObservedRunningTime="2025-12-09 11:55:01.869098321 +0000 UTC m=+1388.694299845" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.077822 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-vhzt5" podStartSLOduration=4.077786369 podStartE2EDuration="4.077786369s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:01.884551697 +0000 UTC m=+1388.709753241" watchObservedRunningTime="2025-12-09 11:55:02.077786369 +0000 UTC m=+1388.902987893" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.086804 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.382662 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.472288 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.472887 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473024 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473089 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473082 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473133 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx9jn\" (UniqueName: \"kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473190 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473238 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data\") pod \"76672101-82d9-4d42-b793-6fa33ce5c91a\" (UID: \"76672101-82d9-4d42-b793-6fa33ce5c91a\") " Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473628 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs" (OuterVolumeSpecName: "logs") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.473635 4745 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76672101-82d9-4d42-b793-6fa33ce5c91a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.484694 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.498691 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn" (OuterVolumeSpecName: "kube-api-access-tx9jn") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "kube-api-access-tx9jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.534708 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts" (OuterVolumeSpecName: "scripts") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.586469 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.587015 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.587047 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.587057 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76672101-82d9-4d42-b793-6fa33ce5c91a-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.587066 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx9jn\" (UniqueName: \"kubernetes.io/projected/76672101-82d9-4d42-b793-6fa33ce5c91a-kube-api-access-tx9jn\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.587081 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.675742 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data" (OuterVolumeSpecName: "config-data") pod "76672101-82d9-4d42-b793-6fa33ce5c91a" (UID: "76672101-82d9-4d42-b793-6fa33ce5c91a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.690318 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76672101-82d9-4d42-b793-6fa33ce5c91a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.869984 4745 generic.go:334] "Generic (PLEG): container finished" podID="40d08272-d758-4404-aae4-a64897dfbab8" containerID="f59d16e21842e09e539e7cb40ecdfd9188979d63816912780cb72eb300e3fd22" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.870130 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" event={"ID":"40d08272-d758-4404-aae4-a64897dfbab8","Type":"ContainerDied","Data":"f59d16e21842e09e539e7cb40ecdfd9188979d63816912780cb72eb300e3fd22"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.882833 4745 generic.go:334] "Generic (PLEG): container finished" podID="43837717-2618-45c9-afac-172826694ae5" containerID="09ad1c05b19779977f458ccdf6da3b4f7981fb3e1c55e62d2c5d17dbd9fbfe34" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.883023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vhzt5" event={"ID":"43837717-2618-45c9-afac-172826694ae5","Type":"ContainerDied","Data":"09ad1c05b19779977f458ccdf6da3b4f7981fb3e1c55e62d2c5d17dbd9fbfe34"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.898472 4745 generic.go:334] "Generic (PLEG): container finished" podID="ea1ee906-8314-4be0-9845-f8abfe129175" containerID="231ca82fbfa5fe8bcf14cb9fe6ec420a3371909539a49a3cc7e48ff962566999" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 
11:55:02.898769 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" event={"ID":"ea1ee906-8314-4be0-9845-f8abfe129175","Type":"ContainerDied","Data":"231ca82fbfa5fe8bcf14cb9fe6ec420a3371909539a49a3cc7e48ff962566999"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.919266 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerStarted","Data":"e24562c42d8f0e6b8f7f172d8a23e983f89dc46cf1fa4a946bd8950992629950"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.919819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerStarted","Data":"4e7e1b09d6c348939a67aa64c8205ac1b525543663f22776cd4cffbb72995f22"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.929490 4745 generic.go:334] "Generic (PLEG): container finished" podID="a358310a-9254-4ca9-865c-4af37de2791a" containerID="0bbc9404782fc19ece5c2c4a2c4e395165e19b288eba5d996aad423d039b8dbe" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.929617 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mndrs" event={"ID":"a358310a-9254-4ca9-865c-4af37de2791a","Type":"ContainerDied","Data":"0bbc9404782fc19ece5c2c4a2c4e395165e19b288eba5d996aad423d039b8dbe"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.937114 4745 generic.go:334] "Generic (PLEG): container finished" podID="eb863088-6993-45cc-8d6a-bd1a7d6f403a" containerID="490ddfd6330b9b3002fed97b1c7f9089c9bcf9d6419e599757b54a76e8d3e14a" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.937184 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qq6pt" 
event={"ID":"eb863088-6993-45cc-8d6a-bd1a7d6f403a","Type":"ContainerDied","Data":"490ddfd6330b9b3002fed97b1c7f9089c9bcf9d6419e599757b54a76e8d3e14a"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.939872 4745 generic.go:334] "Generic (PLEG): container finished" podID="2018b5a5-625c-4595-a26f-1f4a6df2bb90" containerID="697de4a9fded904a811d04612b2ec365c4c14861fd0b1d3d145deddded6bd815" exitCode=0 Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.939936 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-03c2-account-create-update-4546z" event={"ID":"2018b5a5-625c-4595-a26f-1f4a6df2bb90","Type":"ContainerDied","Data":"697de4a9fded904a811d04612b2ec365c4c14861fd0b1d3d145deddded6bd815"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.958880 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"76672101-82d9-4d42-b793-6fa33ce5c91a","Type":"ContainerDied","Data":"9aa60e749f6081103894a8da50a852dcf0e5b867eb9d3d6c1623cf71390549f9"} Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.958945 4745 scope.go:117] "RemoveContainer" containerID="a2a411d25b847b9ee2ed11822ecd1b85da7d9379b7d75783aba79718fee4c56a" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.959160 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:55:02 crc kubenswrapper[4745]: I1209 11:55:02.993442 4745 scope.go:117] "RemoveContainer" containerID="15f9d17c8d3ff78dd938d1ea15bc17058234ca145574852e5801d6552db2b1ca" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.081160 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.138779 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.198593 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:55:03 crc kubenswrapper[4745]: E1209 11:55:03.199212 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api-log" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.199241 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api-log" Dec 09 11:55:03 crc kubenswrapper[4745]: E1209 11:55:03.199285 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.199299 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.199560 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.199593 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" containerName="cinder-api-log" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.200914 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.204319 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.204332 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.205082 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.216306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310632 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310695 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310805 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310866 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310932 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrbr\" (UniqueName: \"kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.310956 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.311118 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.311213 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413077 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413127 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413158 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413185 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cxrbr\" (UniqueName: \"kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413261 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413307 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.413333 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.414393 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.414945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc 
kubenswrapper[4745]: I1209 11:55:03.419685 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.419890 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.420129 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.421046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.422319 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.422675 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts\") pod \"cinder-api-0\" (UID: 
\"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.451061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrbr\" (UniqueName: \"kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr\") pod \"cinder-api-0\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.520151 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.570486 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76672101-82d9-4d42-b793-6fa33ce5c91a" path="/var/lib/kubelet/pods/76672101-82d9-4d42-b793-6fa33ce5c91a/volumes" Dec 09 11:55:03 crc kubenswrapper[4745]: I1209 11:55:03.972436 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerStarted","Data":"414057ed133b165e5b79e5d8f44e5cec14d6f6f8a08f6b23dfe41839100e2afa"} Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.068554 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.463083 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.655729 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts\") pod \"ea1ee906-8314-4be0-9845-f8abfe129175\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.656306 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56txb\" (UniqueName: \"kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb\") pod \"ea1ee906-8314-4be0-9845-f8abfe129175\" (UID: \"ea1ee906-8314-4be0-9845-f8abfe129175\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.657541 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.658601 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea1ee906-8314-4be0-9845-f8abfe129175" (UID: "ea1ee906-8314-4be0-9845-f8abfe129175"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.663668 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb" (OuterVolumeSpecName: "kube-api-access-56txb") pod "ea1ee906-8314-4be0-9845-f8abfe129175" (UID: "ea1ee906-8314-4be0-9845-f8abfe129175"). InnerVolumeSpecName "kube-api-access-56txb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.672305 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.694498 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.760035 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k5xv\" (UniqueName: \"kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv\") pod \"43837717-2618-45c9-afac-172826694ae5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.760332 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts\") pod \"43837717-2618-45c9-afac-172826694ae5\" (UID: \"43837717-2618-45c9-afac-172826694ae5\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.761047 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56txb\" (UniqueName: \"kubernetes.io/projected/ea1ee906-8314-4be0-9845-f8abfe129175-kube-api-access-56txb\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.761069 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1ee906-8314-4be0-9845-f8abfe129175-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.761530 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "43837717-2618-45c9-afac-172826694ae5" (UID: "43837717-2618-45c9-afac-172826694ae5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.763879 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mndrs" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.765902 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv" (OuterVolumeSpecName: "kube-api-access-5k5xv") pod "43837717-2618-45c9-afac-172826694ae5" (UID: "43837717-2618-45c9-afac-172826694ae5"). InnerVolumeSpecName "kube-api-access-5k5xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.776573 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.866463 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts\") pod \"a358310a-9254-4ca9-865c-4af37de2791a\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.866746 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2fm6\" (UniqueName: \"kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6\") pod \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.866848 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts\") pod \"40d08272-d758-4404-aae4-a64897dfbab8\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.866900 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts\") pod \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\" (UID: \"2018b5a5-625c-4595-a26f-1f4a6df2bb90\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.866998 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9k4\" (UniqueName: \"kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4\") pod \"a358310a-9254-4ca9-865c-4af37de2791a\" (UID: \"a358310a-9254-4ca9-865c-4af37de2791a\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.867034 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd728\" (UniqueName: \"kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728\") pod \"40d08272-d758-4404-aae4-a64897dfbab8\" (UID: \"40d08272-d758-4404-aae4-a64897dfbab8\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.867932 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40d08272-d758-4404-aae4-a64897dfbab8" (UID: "40d08272-d758-4404-aae4-a64897dfbab8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.868458 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a358310a-9254-4ca9-865c-4af37de2791a" (UID: "a358310a-9254-4ca9-865c-4af37de2791a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.872366 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6" (OuterVolumeSpecName: "kube-api-access-q2fm6") pod "2018b5a5-625c-4595-a26f-1f4a6df2bb90" (UID: "2018b5a5-625c-4595-a26f-1f4a6df2bb90"). InnerVolumeSpecName "kube-api-access-q2fm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.877655 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2018b5a5-625c-4595-a26f-1f4a6df2bb90" (UID: "2018b5a5-625c-4595-a26f-1f4a6df2bb90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.877849 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728" (OuterVolumeSpecName: "kube-api-access-vd728") pod "40d08272-d758-4404-aae4-a64897dfbab8" (UID: "40d08272-d758-4404-aae4-a64897dfbab8"). InnerVolumeSpecName "kube-api-access-vd728". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.885988 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4" (OuterVolumeSpecName: "kube-api-access-4t9k4") pod "a358310a-9254-4ca9-865c-4af37de2791a" (UID: "a358310a-9254-4ca9-865c-4af37de2791a"). InnerVolumeSpecName "kube-api-access-4t9k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887091 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2fm6\" (UniqueName: \"kubernetes.io/projected/2018b5a5-625c-4595-a26f-1f4a6df2bb90-kube-api-access-q2fm6\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887124 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43837717-2618-45c9-afac-172826694ae5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887138 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40d08272-d758-4404-aae4-a64897dfbab8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887149 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2018b5a5-625c-4595-a26f-1f4a6df2bb90-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887158 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9k4\" (UniqueName: \"kubernetes.io/projected/a358310a-9254-4ca9-865c-4af37de2791a-kube-api-access-4t9k4\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887168 4745 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-vd728\" (UniqueName: \"kubernetes.io/projected/40d08272-d758-4404-aae4-a64897dfbab8-kube-api-access-vd728\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887179 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a358310a-9254-4ca9-865c-4af37de2791a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.887189 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k5xv\" (UniqueName: \"kubernetes.io/projected/43837717-2618-45c9-afac-172826694ae5-kube-api-access-5k5xv\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.988999 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts\") pod \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.989123 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbpnl\" (UniqueName: \"kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl\") pod \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\" (UID: \"eb863088-6993-45cc-8d6a-bd1a7d6f403a\") " Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.990882 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb863088-6993-45cc-8d6a-bd1a7d6f403a" (UID: "eb863088-6993-45cc-8d6a-bd1a7d6f403a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:55:04 crc kubenswrapper[4745]: I1209 11:55:04.994755 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl" (OuterVolumeSpecName: "kube-api-access-gbpnl") pod "eb863088-6993-45cc-8d6a-bd1a7d6f403a" (UID: "eb863088-6993-45cc-8d6a-bd1a7d6f403a"). InnerVolumeSpecName "kube-api-access-gbpnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.003956 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-03c2-account-create-update-4546z" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.004045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-03c2-account-create-update-4546z" event={"ID":"2018b5a5-625c-4595-a26f-1f4a6df2bb90","Type":"ContainerDied","Data":"1936d8323d012c85ee90cd17a324bd51defe561fa2a166545d2a5dd2bf21ef09"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.004115 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1936d8323d012c85ee90cd17a324bd51defe561fa2a166545d2a5dd2bf21ef09" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.009630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" event={"ID":"40d08272-d758-4404-aae4-a64897dfbab8","Type":"ContainerDied","Data":"46226b9ffb5988e8219e0525679bbc71592b889cd06eda1d8842f7dc987acbb7"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.009687 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46226b9ffb5988e8219e0525679bbc71592b889cd06eda1d8842f7dc987acbb7" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.009654 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e536-account-create-update-bnmp6" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.011549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vhzt5" event={"ID":"43837717-2618-45c9-afac-172826694ae5","Type":"ContainerDied","Data":"fab4eac75902e5206111ebbbaa5ca85e3d3c4daaf0007360160ca30a847a7fb2"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.011590 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab4eac75902e5206111ebbbaa5ca85e3d3c4daaf0007360160ca30a847a7fb2" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.011640 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vhzt5" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.029353 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" event={"ID":"ea1ee906-8314-4be0-9845-f8abfe129175","Type":"ContainerDied","Data":"ee17a952629dfbec967a9919d6ae8806d7f786cd8c251e54eac8ba1f8923275e"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.029410 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d49f-account-create-update-kqsjx" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.029402 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee17a952629dfbec967a9919d6ae8806d7f786cd8c251e54eac8ba1f8923275e" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.038851 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mndrs" event={"ID":"a358310a-9254-4ca9-865c-4af37de2791a","Type":"ContainerDied","Data":"53afa50637f1833a4648bd0a61746e8204a5e2b4d528416690ad18c6961d7a55"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.038909 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53afa50637f1833a4648bd0a61746e8204a5e2b4d528416690ad18c6961d7a55" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.040557 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mndrs" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.046007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qq6pt" event={"ID":"eb863088-6993-45cc-8d6a-bd1a7d6f403a","Type":"ContainerDied","Data":"be659393e8f0ce48536bffe9de9c1158ef11bf7b069462ea7649724fcdb8d96d"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.046081 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be659393e8f0ce48536bffe9de9c1158ef11bf7b069462ea7649724fcdb8d96d" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.046187 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qq6pt" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.058054 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerStarted","Data":"c36983d61b2699871ca1ed37f3d5714e01df50b267fc939dd2ffc2d0e03bdef2"} Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.103778 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbpnl\" (UniqueName: \"kubernetes.io/projected/eb863088-6993-45cc-8d6a-bd1a7d6f403a-kube-api-access-gbpnl\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.103827 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb863088-6993-45cc-8d6a-bd1a7d6f403a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.385436 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.385828 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-log" containerID="cri-o://e8813e68ec5c04a5eca84f32495694cece28ddc0c160e62ff127bb4da6d2300f" gracePeriod=30 Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.385871 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-httpd" containerID="cri-o://416bc8400e98805bbceaa9c931f9fca8fd243adefdbab5e6704f409c75c19e64" gracePeriod=30 Dec 09 11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.483174 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 
11:55:05 crc kubenswrapper[4745]: I1209 11:55:05.484005 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.074165 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerID="e8813e68ec5c04a5eca84f32495694cece28ddc0c160e62ff127bb4da6d2300f" exitCode=143 Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.074217 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerDied","Data":"e8813e68ec5c04a5eca84f32495694cece28ddc0c160e62ff127bb4da6d2300f"} Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.086633 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-central-agent" containerID="cri-o://4e7e1b09d6c348939a67aa64c8205ac1b525543663f22776cd4cffbb72995f22" gracePeriod=30 Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.086744 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerStarted","Data":"89e5af4abad5ee0743bcbd81fbe9f9dd9b08ce4780a5efc8fea7eeb97453ad69"} Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.086823 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.086891 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="proxy-httpd" containerID="cri-o://89e5af4abad5ee0743bcbd81fbe9f9dd9b08ce4780a5efc8fea7eeb97453ad69" gracePeriod=30 Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.087026 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="sg-core" containerID="cri-o://414057ed133b165e5b79e5d8f44e5cec14d6f6f8a08f6b23dfe41839100e2afa" gracePeriod=30 Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.087148 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-notification-agent" containerID="cri-o://e24562c42d8f0e6b8f7f172d8a23e983f89dc46cf1fa4a946bd8950992629950" gracePeriod=30 Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.098194 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerStarted","Data":"3cb8e987a7a6b5a17dc3a1ce4421d7a97a43c5bcf0226be502cadc9674ba1fdf"} Dec 09 11:55:06 crc kubenswrapper[4745]: I1209 11:55:06.119059 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.177681789 podStartE2EDuration="8.119033926s" podCreationTimestamp="2025-12-09 11:54:58 +0000 UTC" firstStartedPulling="2025-12-09 11:55:00.93543272 +0000 UTC m=+1387.760634254" lastFinishedPulling="2025-12-09 11:55:04.876784867 +0000 UTC m=+1391.701986391" observedRunningTime="2025-12-09 11:55:06.11621477 +0000 UTC m=+1392.941416304" watchObservedRunningTime="2025-12-09 11:55:06.119033926 +0000 UTC m=+1392.944235450" Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110466 4745 generic.go:334] "Generic (PLEG): container finished" podID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerID="89e5af4abad5ee0743bcbd81fbe9f9dd9b08ce4780a5efc8fea7eeb97453ad69" exitCode=0 Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110844 4745 generic.go:334] "Generic (PLEG): container finished" podID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" 
containerID="414057ed133b165e5b79e5d8f44e5cec14d6f6f8a08f6b23dfe41839100e2afa" exitCode=2 Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110854 4745 generic.go:334] "Generic (PLEG): container finished" podID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerID="e24562c42d8f0e6b8f7f172d8a23e983f89dc46cf1fa4a946bd8950992629950" exitCode=0 Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110546 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerDied","Data":"89e5af4abad5ee0743bcbd81fbe9f9dd9b08ce4780a5efc8fea7eeb97453ad69"} Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110908 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerDied","Data":"414057ed133b165e5b79e5d8f44e5cec14d6f6f8a08f6b23dfe41839100e2afa"} Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.110927 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerDied","Data":"e24562c42d8f0e6b8f7f172d8a23e983f89dc46cf1fa4a946bd8950992629950"} Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.113720 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerStarted","Data":"84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38"} Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.113994 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 11:55:07 crc kubenswrapper[4745]: I1209 11:55:07.145601 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.145573348 podStartE2EDuration="4.145573348s" podCreationTimestamp="2025-12-09 11:55:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:07.137428889 +0000 UTC m=+1393.962630423" watchObservedRunningTime="2025-12-09 11:55:07.145573348 +0000 UTC m=+1393.970774872" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.157096 4745 generic.go:334] "Generic (PLEG): container finished" podID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerID="4e7e1b09d6c348939a67aa64c8205ac1b525543663f22776cd4cffbb72995f22" exitCode=0 Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.157181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerDied","Data":"4e7e1b09d6c348939a67aa64c8205ac1b525543663f22776cd4cffbb72995f22"} Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.491407 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.614438 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.614681 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c667h\" (UniqueName: \"kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.614842 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd\") pod 
\"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.614921 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.614975 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.615000 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.615063 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml\") pod \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\" (UID: \"81c3dd1f-4ab9-4d80-b46c-0fc94318b410\") " Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.615489 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.616114 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.616395 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.624144 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts" (OuterVolumeSpecName: "scripts") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.624251 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h" (OuterVolumeSpecName: "kube-api-access-c667h") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "kube-api-access-c667h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.652549 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.738287 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c667h\" (UniqueName: \"kubernetes.io/projected/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-kube-api-access-c667h\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.738955 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.738979 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.738991 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.768619 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.800427 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data" (OuterVolumeSpecName: "config-data") pod "81c3dd1f-4ab9-4d80-b46c-0fc94318b410" (UID: "81c3dd1f-4ab9-4d80-b46c-0fc94318b410"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.841826 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:08 crc kubenswrapper[4745]: I1209 11:55:08.841874 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c3dd1f-4ab9-4d80-b46c-0fc94318b410-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.174703 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerID="416bc8400e98805bbceaa9c931f9fca8fd243adefdbab5e6704f409c75c19e64" exitCode=0 Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.174898 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerDied","Data":"416bc8400e98805bbceaa9c931f9fca8fd243adefdbab5e6704f409c75c19e64"} Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.175076 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d8d5a02f-3b73-4a98-8b02-4f150574674e","Type":"ContainerDied","Data":"02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8"} Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.175102 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d10a5871ed8b2ebf24618c25900205b3f155855cd4743dd778b487cdcadee8" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.182730 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c3dd1f-4ab9-4d80-b46c-0fc94318b410","Type":"ContainerDied","Data":"3a51605f3d4db0df92908e29b976943e955add46b97450c4e60eccd50cdc7c14"} Dec 09 11:55:09 
crc kubenswrapper[4745]: I1209 11:55:09.182819 4745 scope.go:117] "RemoveContainer" containerID="89e5af4abad5ee0743bcbd81fbe9f9dd9b08ce4780a5efc8fea7eeb97453ad69" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.182861 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.196091 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.219151 4745 scope.go:117] "RemoveContainer" containerID="414057ed133b165e5b79e5d8f44e5cec14d6f6f8a08f6b23dfe41839100e2afa" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252448 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252595 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nlbk\" (UniqueName: \"kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252705 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs\") pod 
\"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252762 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252792 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252853 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.252964 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs\") pod \"d8d5a02f-3b73-4a98-8b02-4f150574674e\" (UID: \"d8d5a02f-3b73-4a98-8b02-4f150574674e\") " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.253951 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs" (OuterVolumeSpecName: "logs") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.254305 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.260200 4745 scope.go:117] "RemoveContainer" containerID="e24562c42d8f0e6b8f7f172d8a23e983f89dc46cf1fa4a946bd8950992629950" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.271858 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk" (OuterVolumeSpecName: "kube-api-access-7nlbk") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "kube-api-access-7nlbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.287675 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.319768 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts" (OuterVolumeSpecName: "scripts") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.333594 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.356564 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.356866 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.357022 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nlbk\" (UniqueName: \"kubernetes.io/projected/d8d5a02f-3b73-4a98-8b02-4f150574674e-kube-api-access-7nlbk\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.357108 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.357204 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d5a02f-3b73-4a98-8b02-4f150574674e-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.372654 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.409447 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.409954 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-log" Dec 09 
11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.409968 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-log" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.409982 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2018b5a5-625c-4595-a26f-1f4a6df2bb90" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410012 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2018b5a5-625c-4595-a26f-1f4a6df2bb90" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410025 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb863088-6993-45cc-8d6a-bd1a7d6f403a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410032 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb863088-6993-45cc-8d6a-bd1a7d6f403a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410047 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1ee906-8314-4be0-9845-f8abfe129175" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410053 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1ee906-8314-4be0-9845-f8abfe129175" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410071 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="sg-core" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410079 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="sg-core" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410110 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43837717-2618-45c9-afac-172826694ae5" 
containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410118 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="43837717-2618-45c9-afac-172826694ae5" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410132 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="proxy-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410138 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="proxy-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410150 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a358310a-9254-4ca9-865c-4af37de2791a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410156 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a358310a-9254-4ca9-865c-4af37de2791a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410166 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d08272-d758-4404-aae4-a64897dfbab8" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410172 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d08272-d758-4404-aae4-a64897dfbab8" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410182 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-notification-agent" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410188 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-notification-agent" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410195 4745 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410202 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: E1209 11:55:09.410216 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-central-agent" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.410221 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-central-agent" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411017 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-notification-agent" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411038 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411052 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d08272-d758-4404-aae4-a64897dfbab8" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411059 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2018b5a5-625c-4595-a26f-1f4a6df2bb90" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411066 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="43837717-2618-45c9-afac-172826694ae5" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411077 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="ceilometer-central-agent" Dec 09 11:55:09 crc 
kubenswrapper[4745]: I1209 11:55:09.411090 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a358310a-9254-4ca9-865c-4af37de2791a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411098 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="sg-core" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411109 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" containerName="proxy-httpd" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411123 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1ee906-8314-4be0-9845-f8abfe129175" containerName="mariadb-account-create-update" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411133 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb863088-6993-45cc-8d6a-bd1a7d6f403a" containerName="mariadb-database-create" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.411141 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" containerName="glance-log" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.413196 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.420918 4745 scope.go:117] "RemoveContainer" containerID="4e7e1b09d6c348939a67aa64c8205ac1b525543663f22776cd4cffbb72995f22" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.429694 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.429989 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.431389 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.434906 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.454418 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.465698 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.469651 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.473191 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.473287 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data" (OuterVolumeSpecName: "config-data") pod "d8d5a02f-3b73-4a98-8b02-4f150574674e" (UID: "d8d5a02f-3b73-4a98-8b02-4f150574674e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.516318 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqbmx"] Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.519269 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.530074 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kbcl6" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.530323 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.530467 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.547535 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqbmx"] Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576477 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576555 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt24f\" (UniqueName: \"kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576587 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576646 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576682 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576721 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576853 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576867 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d5a02f-3b73-4a98-8b02-4f150574674e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.576870 4745 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c3dd1f-4ab9-4d80-b46c-0fc94318b410" path="/var/lib/kubelet/pods/81c3dd1f-4ab9-4d80-b46c-0fc94318b410/volumes" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679170 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679222 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt24f\" (UniqueName: \"kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679257 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679291 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc44p\" (UniqueName: \"kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: 
\"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679348 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679379 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679435 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679504 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.679554 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.680073 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.681694 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.686204 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.686217 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.687769 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.688201 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.704451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt24f\" (UniqueName: \"kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f\") pod \"ceilometer-0\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") " pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.775116 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.782266 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.783267 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc44p\" (UniqueName: \"kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.783419 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.783466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.786898 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.787246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.790868 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.812457 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc44p\" (UniqueName: 
\"kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p\") pod \"nova-cell0-conductor-db-sync-qqbmx\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:09 crc kubenswrapper[4745]: I1209 11:55:09.847616 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.049427 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9294p"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.056532 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.076966 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9294p"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.203098 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn8mg\" (UniqueName: \"kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.203699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.203755 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.206429 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.267918 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.294440 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.302599 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.305819 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn8mg\" (UniqueName: \"kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.305921 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.305982 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " 
pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.306614 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.307045 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.330322 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.332282 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.337103 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.337204 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.349265 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn8mg\" (UniqueName: \"kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg\") pod \"redhat-operators-9294p\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") " pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.350194 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.402254 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9294p" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.409939 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pnj\" (UniqueName: \"kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410079 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410469 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410530 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410562 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.410590 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515630 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515794 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515846 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515878 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.515904 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " 
pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.519173 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.519316 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pnj\" (UniqueName: \"kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.520718 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.521364 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.522447 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: 
I1209 11:55:10.527759 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.534241 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.536378 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.548925 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pnj\" (UniqueName: \"kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.611829 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqbmx"] Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.631220 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " 
pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.704310 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:55:10 crc kubenswrapper[4745]: I1209 11:55:10.810413 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9294p"] Dec 09 11:55:10 crc kubenswrapper[4745]: W1209 11:55:10.818326 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca574c56_3920_4a4b_be3b_edbbad9fbb26.slice/crio-71a56f57d045255e19e61a23950102de6792921a6804f4f9005a02cf2b2a915c WatchSource:0}: Error finding container 71a56f57d045255e19e61a23950102de6792921a6804f4f9005a02cf2b2a915c: Status 404 returned error can't find the container with id 71a56f57d045255e19e61a23950102de6792921a6804f4f9005a02cf2b2a915c Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.230152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerStarted","Data":"0d6c154453bce530427a527aa857fda96a96cb6f1dab40f15fb99cb39e935409"} Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.230819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerStarted","Data":"4a7fe3e416e43f0fa9a4240c4f4f825d0d0307961cf41ff65eff84023ac68bff"} Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.232248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" event={"ID":"ae7f8892-3148-4488-bfa2-afe44917e31d","Type":"ContainerStarted","Data":"a2bb5623605328c8b2e49b242e6d09ac8d147302e5702fb01233374cd9963f79"} Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.239565 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerID="5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce" exitCode=0 Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.239605 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerDied","Data":"5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce"} Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.239625 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerStarted","Data":"71a56f57d045255e19e61a23950102de6792921a6804f4f9005a02cf2b2a915c"} Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.585323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d5a02f-3b73-4a98-8b02-4f150574674e" path="/var/lib/kubelet/pods/d8d5a02f-3b73-4a98-8b02-4f150574674e/volumes" Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.588311 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.620483 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.620794 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-log" containerID="cri-o://dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c" gracePeriod=30 Dec 09 11:55:11 crc kubenswrapper[4745]: I1209 11:55:11.621214 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-httpd" 
containerID="cri-o://5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775" gracePeriod=30 Dec 09 11:55:12 crc kubenswrapper[4745]: I1209 11:55:12.312113 4745 generic.go:334] "Generic (PLEG): container finished" podID="4466c464-f51f-4227-94fc-e216c99fe969" containerID="dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c" exitCode=143 Dec 09 11:55:12 crc kubenswrapper[4745]: I1209 11:55:12.313452 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerDied","Data":"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c"} Dec 09 11:55:12 crc kubenswrapper[4745]: I1209 11:55:12.321126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerStarted","Data":"a42b3f735f3b95ffb7233c7ebfab52bb67652b0981f03c484d766fee5392b7c6"} Dec 09 11:55:12 crc kubenswrapper[4745]: I1209 11:55:12.327840 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerStarted","Data":"3c85021515548f94941fc14712c22b63b3fc3882063840e3d2b2b60e120adcac"} Dec 09 11:55:13 crc kubenswrapper[4745]: I1209 11:55:13.368990 4745 generic.go:334] "Generic (PLEG): container finished" podID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerID="5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3" exitCode=0 Dec 09 11:55:13 crc kubenswrapper[4745]: I1209 11:55:13.369665 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerDied","Data":"5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3"} Dec 09 11:55:13 crc kubenswrapper[4745]: I1209 11:55:13.379261 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerStarted","Data":"2aab079900e20f077aec274299cf36a213894e204aae63b260e81da4fa474aa3"} Dec 09 11:55:13 crc kubenswrapper[4745]: I1209 11:55:13.382420 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerStarted","Data":"892538c48ca05829a077bcf325c90ee4bb55781acc43871ab0ffe0358d7af1b9"} Dec 09 11:55:13 crc kubenswrapper[4745]: I1209 11:55:13.958534 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:14 crc kubenswrapper[4745]: I1209 11:55:14.410128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerStarted","Data":"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"} Dec 09 11:55:14 crc kubenswrapper[4745]: I1209 11:55:14.417966 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerStarted","Data":"a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db"} Dec 09 11:55:14 crc kubenswrapper[4745]: I1209 11:55:14.467551 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9294p" podStartSLOduration=1.870027902 podStartE2EDuration="4.467529364s" podCreationTimestamp="2025-12-09 11:55:10 +0000 UTC" firstStartedPulling="2025-12-09 11:55:11.242241457 +0000 UTC m=+1398.067442981" lastFinishedPulling="2025-12-09 11:55:13.839742919 +0000 UTC m=+1400.664944443" observedRunningTime="2025-12-09 11:55:14.460896325 +0000 UTC m=+1401.286097839" watchObservedRunningTime="2025-12-09 11:55:14.467529364 +0000 UTC m=+1401.292730888" Dec 09 11:55:14 crc kubenswrapper[4745]: I1209 11:55:14.499517 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.499469614 podStartE2EDuration="4.499469614s" podCreationTimestamp="2025-12-09 11:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:14.489920007 +0000 UTC m=+1401.315121531" watchObservedRunningTime="2025-12-09 11:55:14.499469614 +0000 UTC m=+1401.324671138" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.402118 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.460943 4745 generic.go:334] "Generic (PLEG): container finished" podID="4466c464-f51f-4227-94fc-e216c99fe969" containerID="5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775" exitCode=0 Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.461086 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerDied","Data":"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775"} Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.461148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4466c464-f51f-4227-94fc-e216c99fe969","Type":"ContainerDied","Data":"95ef1d24e53155337ca1b234cf7e8a3d22309ee1aef2f7559b0bb0c49a7e7268"} Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.461168 4745 scope.go:117] "RemoveContainer" containerID="5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.461162 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.486657 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-central-agent" containerID="cri-o://0d6c154453bce530427a527aa857fda96a96cb6f1dab40f15fb99cb39e935409" gracePeriod=30 Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.486874 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerStarted","Data":"deb421994406caafb9675151a6a79a725d8af8d88f67cda7cc025084958f2dfb"} Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.487134 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.487479 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="proxy-httpd" containerID="cri-o://deb421994406caafb9675151a6a79a725d8af8d88f67cda7cc025084958f2dfb" gracePeriod=30 Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.487549 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="sg-core" containerID="cri-o://2aab079900e20f077aec274299cf36a213894e204aae63b260e81da4fa474aa3" gracePeriod=30 Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.487586 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-notification-agent" containerID="cri-o://a42b3f735f3b95ffb7233c7ebfab52bb67652b0981f03c484d766fee5392b7c6" gracePeriod=30 Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.516789 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.522600157 podStartE2EDuration="6.516763776s" podCreationTimestamp="2025-12-09 11:55:09 +0000 UTC" firstStartedPulling="2025-12-09 11:55:10.305275628 +0000 UTC m=+1397.130477152" lastFinishedPulling="2025-12-09 11:55:14.299439247 +0000 UTC m=+1401.124640771" observedRunningTime="2025-12-09 11:55:15.512276065 +0000 UTC m=+1402.337477589" watchObservedRunningTime="2025-12-09 11:55:15.516763776 +0000 UTC m=+1402.341965300" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.539006 4745 scope.go:117] "RemoveContainer" containerID="dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.576237 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578037 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578266 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578499 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle\") 
pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.578898 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.579042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run\") pod \"4466c464-f51f-4227-94fc-e216c99fe969\" (UID: \"4466c464-f51f-4227-94fc-e216c99fe969\") " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.580415 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.591475 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs" (OuterVolumeSpecName: "logs") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.599384 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49" (OuterVolumeSpecName: "kube-api-access-76p49") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "kube-api-access-76p49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.601052 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts" (OuterVolumeSpecName: "scripts") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.612672 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.635324 4745 scope.go:117] "RemoveContainer" containerID="5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775" Dec 09 11:55:15 crc kubenswrapper[4745]: E1209 11:55:15.645271 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775\": container with ID starting with 5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775 not found: ID does not exist" containerID="5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.645325 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775"} err="failed to get container status \"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775\": rpc error: code = NotFound desc = could not find container \"5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775\": container with ID starting with 5fe186d0b5f18fa35433ed92b6dadb74514c3896f1a7619b4b74694f5b3b2775 not found: ID does not exist" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.645360 4745 scope.go:117] "RemoveContainer" containerID="dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c" Dec 09 11:55:15 crc kubenswrapper[4745]: E1209 11:55:15.645896 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c\": container with ID starting with dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c not found: ID does not exist" containerID="dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 
11:55:15.645954 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c"} err="failed to get container status \"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c\": rpc error: code = NotFound desc = could not find container \"dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c\": container with ID starting with dcd1005a8885bb5693574d21b05dd46594f5fdbacb6179fff6af636199980b0c not found: ID does not exist" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.651185 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.682998 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.683344 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.683362 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/4466c464-f51f-4227-94fc-e216c99fe969-kube-api-access-76p49\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.683371 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.683382 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4466c464-f51f-4227-94fc-e216c99fe969-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.683389 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.717559 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.717866 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.718780 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data" (OuterVolumeSpecName: "config-data") pod "4466c464-f51f-4227-94fc-e216c99fe969" (UID: "4466c464-f51f-4227-94fc-e216c99fe969"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.790113 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.790159 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4466c464-f51f-4227-94fc-e216c99fe969-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.790173 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.841095 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.858873 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.911796 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:15 crc kubenswrapper[4745]: E1209 11:55:15.912782 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-httpd" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.912801 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-httpd" Dec 09 11:55:15 crc kubenswrapper[4745]: E1209 11:55:15.912854 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-log" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.912860 4745 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-log" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.913251 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-httpd" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.913269 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4466c464-f51f-4227-94fc-e216c99fe969" containerName="glance-log" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.919721 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.925853 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.929806 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 11:55:15 crc kubenswrapper[4745]: I1209 11:55:15.946368 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102055 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102167 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102236 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102266 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztgzt\" (UniqueName: \"kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102339 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102388 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.102429 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.208890 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209126 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209279 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209320 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztgzt\" (UniqueName: \"kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209369 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209438 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209519 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209588 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.209611 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc 
kubenswrapper[4745]: I1209 11:55:16.209672 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.210567 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.219414 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.220191 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.220916 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.234359 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ztgzt\" (UniqueName: \"kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.234687 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.251114 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.502474 4745 generic.go:334] "Generic (PLEG): container finished" podID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerID="deb421994406caafb9675151a6a79a725d8af8d88f67cda7cc025084958f2dfb" exitCode=0 Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.502955 4745 generic.go:334] "Generic (PLEG): container finished" podID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerID="2aab079900e20f077aec274299cf36a213894e204aae63b260e81da4fa474aa3" exitCode=2 Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.502559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerDied","Data":"deb421994406caafb9675151a6a79a725d8af8d88f67cda7cc025084958f2dfb"} Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.502970 4745 generic.go:334] "Generic (PLEG): container finished" podID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" 
containerID="a42b3f735f3b95ffb7233c7ebfab52bb67652b0981f03c484d766fee5392b7c6" exitCode=0 Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.503066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerDied","Data":"2aab079900e20f077aec274299cf36a213894e204aae63b260e81da4fa474aa3"} Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.503115 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerDied","Data":"a42b3f735f3b95ffb7233c7ebfab52bb67652b0981f03c484d766fee5392b7c6"} Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.552141 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:16 crc kubenswrapper[4745]: I1209 11:55:16.793743 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 11:55:17 crc kubenswrapper[4745]: I1209 11:55:17.218960 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:55:17 crc kubenswrapper[4745]: W1209 11:55:17.228937 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0452bd55_b0a3_46a9_a388_6db2e40f4cb7.slice/crio-3d993fc02b6d7627f5ebff8193e42959f19300f1c6d87f7ca61384cdbda0b025 WatchSource:0}: Error finding container 3d993fc02b6d7627f5ebff8193e42959f19300f1c6d87f7ca61384cdbda0b025: Status 404 returned error can't find the container with id 3d993fc02b6d7627f5ebff8193e42959f19300f1c6d87f7ca61384cdbda0b025 Dec 09 11:55:17 crc kubenswrapper[4745]: I1209 11:55:17.643451 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4466c464-f51f-4227-94fc-e216c99fe969" path="/var/lib/kubelet/pods/4466c464-f51f-4227-94fc-e216c99fe969/volumes" Dec 09 
11:55:17 crc kubenswrapper[4745]: I1209 11:55:17.644897 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerStarted","Data":"3d993fc02b6d7627f5ebff8193e42959f19300f1c6d87f7ca61384cdbda0b025"}
Dec 09 11:55:19 crc kubenswrapper[4745]: I1209 11:55:19.640735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerStarted","Data":"0d37cddfc8eb090792805f604b6641cd1c3d6607edd7eead42610629d259af0b"}
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.403388 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.403858 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.705416 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.705474 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.754207 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:20 crc kubenswrapper[4745]: I1209 11:55:20.760024 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:21 crc kubenswrapper[4745]: I1209 11:55:21.453493 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9294p" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="registry-server" probeResult="failure" output=<
Dec 09 11:55:21 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s
Dec 09 11:55:21 crc kubenswrapper[4745]: >
Dec 09 11:55:21 crc kubenswrapper[4745]: I1209 11:55:21.669143 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:21 crc kubenswrapper[4745]: I1209 11:55:21.669193 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:24 crc kubenswrapper[4745]: I1209 11:55:24.072736 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:24 crc kubenswrapper[4745]: I1209 11:55:24.073435 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 09 11:55:24 crc kubenswrapper[4745]: I1209 11:55:24.087249 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 09 11:55:25 crc kubenswrapper[4745]: I1209 11:55:25.717791 4745 generic.go:334] "Generic (PLEG): container finished" podID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerID="0d6c154453bce530427a527aa857fda96a96cb6f1dab40f15fb99cb39e935409" exitCode=0
Dec 09 11:55:25 crc kubenswrapper[4745]: I1209 11:55:25.717880 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerDied","Data":"0d6c154453bce530427a527aa857fda96a96cb6f1dab40f15fb99cb39e935409"}
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.098533 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.112650 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt24f\" (UniqueName: \"kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.112766 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.112814 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.112952 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.113975 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.114205 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.114248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.114290 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd\") pod \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\" (UID: \"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef\") "
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.115504 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.115956 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.119185 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts" (OuterVolumeSpecName: "scripts") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.121797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f" (OuterVolumeSpecName: "kube-api-access-zt24f") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "kube-api-access-zt24f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.181263 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.210763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.218676 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.218934 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.219125 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.219223 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt24f\" (UniqueName: \"kubernetes.io/projected/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-kube-api-access-zt24f\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.219295 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.268179 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data" (OuterVolumeSpecName: "config-data") pod "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" (UID: "6498f521-af2f-4f1c-b1d4-b33ad2bd8aef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.321459 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.735990 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6498f521-af2f-4f1c-b1d4-b33ad2bd8aef","Type":"ContainerDied","Data":"4a7fe3e416e43f0fa9a4240c4f4f825d0d0307961cf41ff65eff84023ac68bff"}
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.736056 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.736080 4745 scope.go:117] "RemoveContainer" containerID="deb421994406caafb9675151a6a79a725d8af8d88f67cda7cc025084958f2dfb"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.750618 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" event={"ID":"ae7f8892-3148-4488-bfa2-afe44917e31d","Type":"ContainerStarted","Data":"91e93b32cad320971d9e0a25ede2bff04e8efbc6587e0a74698dda3692d9f90a"}
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.758552 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerStarted","Data":"6d7b68aa247edb856bd53e3fca4235d3708496404ba738f128d0babd51b390df"}
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.774945 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" podStartSLOduration=2.563930399 podStartE2EDuration="17.77491387s" podCreationTimestamp="2025-12-09 11:55:09 +0000 UTC" firstStartedPulling="2025-12-09 11:55:10.64033126 +0000 UTC m=+1397.465532784" lastFinishedPulling="2025-12-09 11:55:25.851314731 +0000 UTC m=+1412.676516255" observedRunningTime="2025-12-09 11:55:26.774599102 +0000 UTC m=+1413.599800626" watchObservedRunningTime="2025-12-09 11:55:26.77491387 +0000 UTC m=+1413.600115394"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.789851 4745 scope.go:117] "RemoveContainer" containerID="2aab079900e20f077aec274299cf36a213894e204aae63b260e81da4fa474aa3"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.835232 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.835206514 podStartE2EDuration="11.835206514s" podCreationTimestamp="2025-12-09 11:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:26.801293871 +0000 UTC m=+1413.626495405" watchObservedRunningTime="2025-12-09 11:55:26.835206514 +0000 UTC m=+1413.660408038"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.841593 4745 scope.go:117] "RemoveContainer" containerID="a42b3f735f3b95ffb7233c7ebfab52bb67652b0981f03c484d766fee5392b7c6"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.846941 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.869530 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.882363 4745 scope.go:117] "RemoveContainer" containerID="0d6c154453bce530427a527aa857fda96a96cb6f1dab40f15fb99cb39e935409"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.889345 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:55:26 crc kubenswrapper[4745]: E1209 11:55:26.890136 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="sg-core"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890161 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="sg-core"
Dec 09 11:55:26 crc kubenswrapper[4745]: E1209 11:55:26.890182 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="proxy-httpd"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890189 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="proxy-httpd"
Dec 09 11:55:26 crc kubenswrapper[4745]: E1209 11:55:26.890205 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-central-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890211 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-central-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: E1209 11:55:26.890225 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-notification-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890233 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-notification-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890441 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-central-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890468 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="sg-core"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890481 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="proxy-httpd"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.890490 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" containerName="ceilometer-notification-agent"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.893390 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.895779 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.898028 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 09 11:55:26 crc kubenswrapper[4745]: I1209 11:55:26.905843 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.060392 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.060637 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.060734 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7qs\" (UniqueName: \"kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.060827 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.060948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.061031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.061378 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163652 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163771 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163817 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163850 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7qs\" (UniqueName: \"kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163879 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.163951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.165207 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.165295 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.170152 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.171480 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.171483 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.182567 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.183470 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7qs\" (UniqueName: \"kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs\") pod \"ceilometer-0\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.266143 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.575089 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6498f521-af2f-4f1c-b1d4-b33ad2bd8aef" path="/var/lib/kubelet/pods/6498f521-af2f-4f1c-b1d4-b33ad2bd8aef/volumes"
Dec 09 11:55:27 crc kubenswrapper[4745]: I1209 11:55:27.768225 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 11:55:28 crc kubenswrapper[4745]: I1209 11:55:28.903467 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerStarted","Data":"42ad34f838c182430b9a5cc4157de9ff41d041d277a8c55fef7ffb07b9316371"}
Dec 09 11:55:29 crc kubenswrapper[4745]: I1209 11:55:29.917860 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerStarted","Data":"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53"}
Dec 09 11:55:30 crc kubenswrapper[4745]: I1209 11:55:30.460165 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:30 crc kubenswrapper[4745]: I1209 11:55:30.530809 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:30 crc kubenswrapper[4745]: I1209 11:55:30.712157 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9294p"]
Dec 09 11:55:30 crc kubenswrapper[4745]: I1209 11:55:30.933792 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerStarted","Data":"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55"}
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.017485 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerStarted","Data":"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8"}
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.017755 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9294p" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="registry-server" containerID="cri-o://eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3" gracePeriod=2
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.526076 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.713741 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content\") pod \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") "
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.714428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities\") pod \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") "
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.714534 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn8mg\" (UniqueName: \"kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg\") pod \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\" (UID: \"ca574c56-3920-4a4b-be3b-edbbad9fbb26\") "
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.715376 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities" (OuterVolumeSpecName: "utilities") pod "ca574c56-3920-4a4b-be3b-edbbad9fbb26" (UID: "ca574c56-3920-4a4b-be3b-edbbad9fbb26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.716305 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.726066 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg" (OuterVolumeSpecName: "kube-api-access-wn8mg") pod "ca574c56-3920-4a4b-be3b-edbbad9fbb26" (UID: "ca574c56-3920-4a4b-be3b-edbbad9fbb26"). InnerVolumeSpecName "kube-api-access-wn8mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.819154 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn8mg\" (UniqueName: \"kubernetes.io/projected/ca574c56-3920-4a4b-be3b-edbbad9fbb26-kube-api-access-wn8mg\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.825336 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca574c56-3920-4a4b-be3b-edbbad9fbb26" (UID: "ca574c56-3920-4a4b-be3b-edbbad9fbb26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:55:32 crc kubenswrapper[4745]: I1209 11:55:32.921767 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca574c56-3920-4a4b-be3b-edbbad9fbb26-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.036616 4745 generic.go:334] "Generic (PLEG): container finished" podID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerID="eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3" exitCode=0
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.036776 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9294p"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.036780 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerDied","Data":"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"}
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.037370 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9294p" event={"ID":"ca574c56-3920-4a4b-be3b-edbbad9fbb26","Type":"ContainerDied","Data":"71a56f57d045255e19e61a23950102de6792921a6804f4f9005a02cf2b2a915c"}
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.037400 4745 scope.go:117] "RemoveContainer" containerID="eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.042410 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerStarted","Data":"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c"}
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.042676 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.067039 4745 scope.go:117] "RemoveContainer" containerID="5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.073294 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.062366317 podStartE2EDuration="7.073267194s" podCreationTimestamp="2025-12-09 11:55:26 +0000 UTC" firstStartedPulling="2025-12-09 11:55:27.773801076 +0000 UTC m=+1414.599002600" lastFinishedPulling="2025-12-09 11:55:32.784701953 +0000 UTC m=+1419.609903477" observedRunningTime="2025-12-09 11:55:33.06721262 +0000 UTC m=+1419.892414144" watchObservedRunningTime="2025-12-09 11:55:33.073267194 +0000 UTC m=+1419.898468728"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.106681 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9294p"]
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.116093 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9294p"]
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.116754 4745 scope.go:117] "RemoveContainer" containerID="5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.138656 4745 scope.go:117] "RemoveContainer" containerID="eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"
Dec 09 11:55:33 crc kubenswrapper[4745]: E1209 11:55:33.139731 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3\": container with ID starting with eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3 not found: ID does not exist" containerID="eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.139800 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3"} err="failed to get container status \"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3\": rpc error: code = NotFound desc = could not find container \"eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3\": container with ID starting with eec6863a2c14d88afc4d631e07dd5f850f64234f44bd97ec2a49470a95dc9eb3 not found: ID does not exist"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.139843 4745 scope.go:117] "RemoveContainer" containerID="5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3"
Dec 09 11:55:33 crc kubenswrapper[4745]: E1209 11:55:33.140251 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3\": container with ID starting with 5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3 not found: ID does not exist" containerID="5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.140302 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3"} err="failed to get container status \"5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3\": rpc error: code = NotFound desc = could not find container \"5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3\": container with ID starting with 5529eac0034a1795998b009f65314009d8bff019c6e01a4678a2044728b409b3 not found: ID does not exist"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.140337 4745 scope.go:117] "RemoveContainer" containerID="5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce"
Dec 09 11:55:33 crc kubenswrapper[4745]: E1209 11:55:33.141369 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce\": container with ID starting with 5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce not found: ID does not exist" containerID="5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.141533 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce"} err="failed to get container status \"5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce\": rpc error: code = NotFound desc = could not find container \"5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce\": container with ID starting with 5d58d6320e5649e6f6dea93f2204e8676438d15acfbce7be4a0b585d589311ce not found: ID does not exist"
Dec 09 11:55:33 crc kubenswrapper[4745]: I1209 11:55:33.580295 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" path="/var/lib/kubelet/pods/ca574c56-3920-4a4b-be3b-edbbad9fbb26/volumes"
Dec 09 11:55:36 crc kubenswrapper[4745]: I1209 11:55:36.553184 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 09 11:55:36 crc kubenswrapper[4745]: I1209 11:55:36.553595 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 09 11:55:36 crc kubenswrapper[4745]: I1209 11:55:36.586722 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 09 11:55:36 crc kubenswrapper[4745]:
I1209 11:55:36.618625 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:37 crc kubenswrapper[4745]: I1209 11:55:37.096911 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:37 crc kubenswrapper[4745]: I1209 11:55:37.096961 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:39 crc kubenswrapper[4745]: I1209 11:55:39.137059 4745 generic.go:334] "Generic (PLEG): container finished" podID="ae7f8892-3148-4488-bfa2-afe44917e31d" containerID="91e93b32cad320971d9e0a25ede2bff04e8efbc6587e0a74698dda3692d9f90a" exitCode=0 Dec 09 11:55:39 crc kubenswrapper[4745]: I1209 11:55:39.137140 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" event={"ID":"ae7f8892-3148-4488-bfa2-afe44917e31d","Type":"ContainerDied","Data":"91e93b32cad320971d9e0a25ede2bff04e8efbc6587e0a74698dda3692d9f90a"} Dec 09 11:55:39 crc kubenswrapper[4745]: I1209 11:55:39.354764 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:39 crc kubenswrapper[4745]: I1209 11:55:39.355251 4745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 11:55:39 crc kubenswrapper[4745]: I1209 11:55:39.891018 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.480049 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.553691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data\") pod \"ae7f8892-3148-4488-bfa2-afe44917e31d\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.553748 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc44p\" (UniqueName: \"kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p\") pod \"ae7f8892-3148-4488-bfa2-afe44917e31d\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.553781 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts\") pod \"ae7f8892-3148-4488-bfa2-afe44917e31d\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.553907 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle\") pod \"ae7f8892-3148-4488-bfa2-afe44917e31d\" (UID: \"ae7f8892-3148-4488-bfa2-afe44917e31d\") " Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.563756 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p" (OuterVolumeSpecName: "kube-api-access-dc44p") pod "ae7f8892-3148-4488-bfa2-afe44917e31d" (UID: "ae7f8892-3148-4488-bfa2-afe44917e31d"). InnerVolumeSpecName "kube-api-access-dc44p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.571761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts" (OuterVolumeSpecName: "scripts") pod "ae7f8892-3148-4488-bfa2-afe44917e31d" (UID: "ae7f8892-3148-4488-bfa2-afe44917e31d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.603489 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data" (OuterVolumeSpecName: "config-data") pod "ae7f8892-3148-4488-bfa2-afe44917e31d" (UID: "ae7f8892-3148-4488-bfa2-afe44917e31d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.613393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae7f8892-3148-4488-bfa2-afe44917e31d" (UID: "ae7f8892-3148-4488-bfa2-afe44917e31d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.658178 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.658215 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc44p\" (UniqueName: \"kubernetes.io/projected/ae7f8892-3148-4488-bfa2-afe44917e31d-kube-api-access-dc44p\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.658227 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:40 crc kubenswrapper[4745]: I1209 11:55:40.658235 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7f8892-3148-4488-bfa2-afe44917e31d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.160261 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" event={"ID":"ae7f8892-3148-4488-bfa2-afe44917e31d","Type":"ContainerDied","Data":"a2bb5623605328c8b2e49b242e6d09ac8d147302e5702fb01233374cd9963f79"} Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.160316 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bb5623605328c8b2e49b242e6d09ac8d147302e5702fb01233374cd9963f79" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.160389 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqbmx" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.308667 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:55:41 crc kubenswrapper[4745]: E1209 11:55:41.310369 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="extract-content" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.310497 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="extract-content" Dec 09 11:55:41 crc kubenswrapper[4745]: E1209 11:55:41.310662 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="registry-server" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.310744 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="registry-server" Dec 09 11:55:41 crc kubenswrapper[4745]: E1209 11:55:41.310832 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7f8892-3148-4488-bfa2-afe44917e31d" containerName="nova-cell0-conductor-db-sync" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.310914 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7f8892-3148-4488-bfa2-afe44917e31d" containerName="nova-cell0-conductor-db-sync" Dec 09 11:55:41 crc kubenswrapper[4745]: E1209 11:55:41.311011 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="extract-utilities" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.311124 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="extract-utilities" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.311495 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ae7f8892-3148-4488-bfa2-afe44917e31d" containerName="nova-cell0-conductor-db-sync" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.311633 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca574c56-3920-4a4b-be3b-edbbad9fbb26" containerName="registry-server" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.313613 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.319555 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kbcl6" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.327153 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.330757 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.477380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.477493 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.477611 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8g6\" (UniqueName: 
\"kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.579907 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.579996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8g6\" (UniqueName: \"kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.580079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.587398 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.607256 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.627199 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8g6\" (UniqueName: \"kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6\") pod \"nova-cell0-conductor-0\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") " pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:41 crc kubenswrapper[4745]: I1209 11:55:41.648678 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:42 crc kubenswrapper[4745]: I1209 11:55:42.199565 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.179830 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.181244 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-central-agent" containerID="cri-o://fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53" gracePeriod=30 Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.181660 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="proxy-httpd" containerID="cri-o://ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c" gracePeriod=30 Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.181731 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="sg-core" containerID="cri-o://f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8" gracePeriod=30 Dec 09 
11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.181774 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-notification-agent" containerID="cri-o://9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55" gracePeriod=30 Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.191039 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d388089-75a9-4e64-8fcf-575fde454708","Type":"ContainerStarted","Data":"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"} Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.191101 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d388089-75a9-4e64-8fcf-575fde454708","Type":"ContainerStarted","Data":"35f4c5a9c3929502edb9496ad564801a9c47af1cf371c60d7f51177b9e941d10"} Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.192784 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.192890 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": EOF" Dec 09 11:55:43 crc kubenswrapper[4745]: I1209 11:55:43.221224 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.221181791 podStartE2EDuration="2.221181791s" podCreationTimestamp="2025-12-09 11:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:43.211052437 +0000 UTC m=+1430.036253971" watchObservedRunningTime="2025-12-09 11:55:43.221181791 +0000 UTC m=+1430.046383315" 
Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208504 4745 generic.go:334] "Generic (PLEG): container finished" podID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerID="ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c" exitCode=0 Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208599 4745 generic.go:334] "Generic (PLEG): container finished" podID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerID="f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8" exitCode=2 Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208610 4745 generic.go:334] "Generic (PLEG): container finished" podID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerID="fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53" exitCode=0 Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208591 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerDied","Data":"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c"} Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208828 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerDied","Data":"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8"} Dec 09 11:55:44 crc kubenswrapper[4745]: I1209 11:55:44.208930 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerDied","Data":"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53"} Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.802240 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.922953 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923107 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923235 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923260 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923370 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7qs\" (UniqueName: \"kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923491 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923559 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts\") pod \"dc9d8839-c6ce-4e56-a397-335863a377c3\" (UID: \"dc9d8839-c6ce-4e56-a397-335863a377c3\") " Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.923651 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.925195 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.929796 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts" (OuterVolumeSpecName: "scripts") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.930682 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs" (OuterVolumeSpecName: "kube-api-access-jb7qs") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "kube-api-access-jb7qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:55:47 crc kubenswrapper[4745]: I1209 11:55:47.952759 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.000155 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025912 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025948 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025961 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025970 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025980 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9d8839-c6ce-4e56-a397-335863a377c3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.025989 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7qs\" (UniqueName: \"kubernetes.io/projected/dc9d8839-c6ce-4e56-a397-335863a377c3-kube-api-access-jb7qs\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.027327 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data" (OuterVolumeSpecName: "config-data") pod "dc9d8839-c6ce-4e56-a397-335863a377c3" (UID: "dc9d8839-c6ce-4e56-a397-335863a377c3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.128910 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9d8839-c6ce-4e56-a397-335863a377c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.253046 4745 generic.go:334] "Generic (PLEG): container finished" podID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerID="9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55" exitCode=0 Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.253130 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.253139 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerDied","Data":"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55"} Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.253299 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9d8839-c6ce-4e56-a397-335863a377c3","Type":"ContainerDied","Data":"42ad34f838c182430b9a5cc4157de9ff41d041d277a8c55fef7ffb07b9316371"} Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.253354 4745 scope.go:117] "RemoveContainer" containerID="ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.285759 4745 scope.go:117] "RemoveContainer" containerID="f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.303791 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.311584 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.328660 4745 scope.go:117] "RemoveContainer" containerID="9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.346008 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.346600 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-notification-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.346617 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-notification-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.346633 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-central-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.346641 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-central-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.346670 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="sg-core" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.346678 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="sg-core" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.346724 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="proxy-httpd" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.346734 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="proxy-httpd" Dec 09 11:55:48 crc 
kubenswrapper[4745]: I1209 11:55:48.347002 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-central-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.347023 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="sg-core" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.347072 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="ceilometer-notification-agent" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.347083 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" containerName="proxy-httpd" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.351851 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.356315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.357167 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.373242 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.388046 4745 scope.go:117] "RemoveContainer" containerID="fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.410160 4745 scope.go:117] "RemoveContainer" containerID="ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.410793 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c\": container with ID starting with ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c not found: ID does not exist" containerID="ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.410837 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c"} err="failed to get container status \"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c\": rpc error: code = NotFound desc = could not find container \"ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c\": container with ID starting with ad72ac68be3c1f6b27c0d6fdfa8f7db10351f4ae310c0473fb5ac484083ade9c not found: ID does not exist" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.410866 4745 scope.go:117] "RemoveContainer" containerID="f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.411130 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8\": container with ID starting with f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8 not found: ID does not exist" containerID="f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.411161 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8"} err="failed to get container status \"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8\": rpc error: code = NotFound desc = could not find container \"f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8\": container with ID 
starting with f00c10056124b0a377d14c2792d65455e5a9ec5045e4d3191349cccef301a3a8 not found: ID does not exist" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.411177 4745 scope.go:117] "RemoveContainer" containerID="9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.411386 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55\": container with ID starting with 9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55 not found: ID does not exist" containerID="9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.411419 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55"} err="failed to get container status \"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55\": rpc error: code = NotFound desc = could not find container \"9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55\": container with ID starting with 9fa64109677f9154c6168b65dec18dcfa72d4c2c5f17b2c0c1ef6a6bd17a5a55 not found: ID does not exist" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.411437 4745 scope.go:117] "RemoveContainer" containerID="fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53" Dec 09 11:55:48 crc kubenswrapper[4745]: E1209 11:55:48.411688 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53\": container with ID starting with fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53 not found: ID does not exist" containerID="fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53" Dec 09 
11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.411717 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53"} err="failed to get container status \"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53\": rpc error: code = NotFound desc = could not find container \"fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53\": container with ID starting with fdd4281cb21e227f59fbeb58d35fc78a31eaac42b18900c7dbf85efcf3984c53 not found: ID does not exist" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536439 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt6c4\" (UniqueName: \"kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536900 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.536969 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.537043 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639288 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639342 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639393 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6c4\" (UniqueName: \"kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639424 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639453 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639882 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.639924 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " 
pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.640014 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.644199 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.644237 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.644994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.653931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.674918 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6c4\" (UniqueName: 
\"kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4\") pod \"ceilometer-0\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " pod="openstack/ceilometer-0" Dec 09 11:55:48 crc kubenswrapper[4745]: I1209 11:55:48.680673 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:55:49 crc kubenswrapper[4745]: I1209 11:55:49.156165 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:55:49 crc kubenswrapper[4745]: W1209 11:55:49.157314 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode53e8e80_072c_424c_97e4_0e78e02db62d.slice/crio-1506e2507fab4fa0c7181cc08f8af000d7c9ec6842d0fb97e34ec1541900481e WatchSource:0}: Error finding container 1506e2507fab4fa0c7181cc08f8af000d7c9ec6842d0fb97e34ec1541900481e: Status 404 returned error can't find the container with id 1506e2507fab4fa0c7181cc08f8af000d7c9ec6842d0fb97e34ec1541900481e Dec 09 11:55:49 crc kubenswrapper[4745]: I1209 11:55:49.268583 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerStarted","Data":"1506e2507fab4fa0c7181cc08f8af000d7c9ec6842d0fb97e34ec1541900481e"} Dec 09 11:55:49 crc kubenswrapper[4745]: I1209 11:55:49.566325 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9d8839-c6ce-4e56-a397-335863a377c3" path="/var/lib/kubelet/pods/dc9d8839-c6ce-4e56-a397-335863a377c3/volumes" Dec 09 11:55:50 crc kubenswrapper[4745]: I1209 11:55:50.280602 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerStarted","Data":"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44"} Dec 09 11:55:51 crc kubenswrapper[4745]: I1209 11:55:51.308956 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerStarted","Data":"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5"} Dec 09 11:55:51 crc kubenswrapper[4745]: I1209 11:55:51.680038 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.250075 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-r7bpq"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.252189 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.259797 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.262394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.262582 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2mg\" (UniqueName: \"kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.262620 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts\") pod 
\"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.262688 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.266529 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-r7bpq"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.270352 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.362299 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerStarted","Data":"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2"} Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.364110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.364580 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw2mg\" (UniqueName: \"kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc 
kubenswrapper[4745]: I1209 11:55:52.364610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.364679 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.372562 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.393584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.394447 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.401209 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dw2mg\" (UniqueName: \"kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg\") pod \"nova-cell0-cell-mapping-r7bpq\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") " pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.444598 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.446497 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.450019 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.480679 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.574385 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.574465 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.574529 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwpz\" (UniqueName: \"kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " 
pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.574594 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.583733 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r7bpq" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.623139 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.624669 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.629501 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.642191 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.690583 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.690981 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc 
kubenswrapper[4745]: I1209 11:55:52.691108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.691206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cz6r\" (UniqueName: \"kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.691227 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.691306 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwpz\" (UniqueName: \"kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.691357 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.697041 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.721305 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwpz\" (UniqueName: \"kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.726359 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.737539 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data\") pod \"nova-api-0\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") " pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.789035 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.792783 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cz6r\" (UniqueName: \"kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.796361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.796531 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.793775 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.815452 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.818449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.819165 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.820475 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.824587 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.838081 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cz6r\" (UniqueName: \"kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r\") pod \"nova-scheduler-0\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.870840 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.872186 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.910700 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.918123 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.918322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gn8w\" (UniqueName: \"kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.921295 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.928641 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhg4q\" (UniqueName: \"kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.928814 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.928891 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.928913 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.951867 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.957066 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:55:52 crc kubenswrapper[4745]: I1209 11:55:52.991875 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.027627 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.192533 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gn8w\" (UniqueName: \"kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.209378 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.211479 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhg4q\" (UniqueName: \"kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.237420 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.237546 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.237578 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.237650 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.241591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.248491 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.250900 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gn8w\" (UniqueName: \"kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.255661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.257914 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.264062 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.264081 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhg4q\" (UniqueName: \"kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.265383 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data\") pod \"nova-metadata-0\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.269579 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.339781 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.339840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppx2\" (UniqueName: \"kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.344593 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.344677 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.344708 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: 
\"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.344830 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.447854 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.448395 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.448433 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zppx2\" (UniqueName: \"kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.448574 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: 
\"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.448648 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.448680 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.450445 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.451611 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.452449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc 
kubenswrapper[4745]: I1209 11:55:53.452544 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.453360 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.481624 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zppx2\" (UniqueName: \"kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2\") pod \"dnsmasq-dns-5bfb54f9b5-ln8w2\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.546241 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.571926 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.702894 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-r7bpq"] Dec 09 11:55:53 crc kubenswrapper[4745]: W1209 11:55:53.733203 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34f458f_6a4f_416a_96b9_18dfd1bb1452.slice/crio-9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35 WatchSource:0}: Error finding container 9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35: Status 404 returned error can't find the container with id 9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35 Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.754073 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zd8rf"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.755438 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.758902 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.766486 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.792413 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zd8rf"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.866092 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.866298 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.866441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7x5\" (UniqueName: \"kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.866482 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.938224 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.962107 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.970836 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.970950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7x5\" (UniqueName: \"kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.970983 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.971035 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.976772 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.979852 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.985903 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:53 crc kubenswrapper[4745]: I1209 11:55:53.999701 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7x5\" (UniqueName: \"kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5\") pod \"nova-cell1-conductor-db-sync-zd8rf\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.095349 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.226737 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.244214 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.375319 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.424321 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" event={"ID":"943383c2-4d4b-47bd-bb39-1d671795e573","Type":"ContainerStarted","Data":"dd07bf0a06c42cffdadf80a72d9647a1f53aa0fdf35828807e87ee742a13915e"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.427938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerStarted","Data":"c62c3ebd05e2cd2ed9713fa2816f00d8063a54b7665aa30cdb7898dade1e8402"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.432540 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerStarted","Data":"d6c7f691fd18c57f3a26fb8d11d35a9a621c18143ce5e9f54714c1c7fcc01232"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.439973 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerStarted","Data":"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.441025 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.443275 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r7bpq" event={"ID":"d34f458f-6a4f-416a-96b9-18dfd1bb1452","Type":"ContainerStarted","Data":"71590042ec928a9bfe46dae653af9e69e9269e11cd2b38ede0e4c60102844e9d"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.443365 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r7bpq" event={"ID":"d34f458f-6a4f-416a-96b9-18dfd1bb1452","Type":"ContainerStarted","Data":"9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.445266 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b","Type":"ContainerStarted","Data":"14777599fa86159cc031c09c718e9680ca048cff1813b17c6f27f39ff7d14b6e"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.448190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"734b7a03-cf8d-44cc-abd1-05219561080c","Type":"ContainerStarted","Data":"079d4f417559e492cc8c54bf3f9d2d88cc82916f811a58c33552b6c7063103e2"} Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.497390 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.984371462 podStartE2EDuration="6.497369054s" podCreationTimestamp="2025-12-09 11:55:48 +0000 UTC" firstStartedPulling="2025-12-09 11:55:49.159733814 +0000 UTC m=+1435.984935338" lastFinishedPulling="2025-12-09 11:55:52.672731406 +0000 UTC m=+1439.497932930" observedRunningTime="2025-12-09 11:55:54.491564278 +0000 UTC m=+1441.316765802" watchObservedRunningTime="2025-12-09 11:55:54.497369054 +0000 UTC m=+1441.322570578" Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.554096 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-r7bpq" 
podStartSLOduration=2.554063443 podStartE2EDuration="2.554063443s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:54.506779638 +0000 UTC m=+1441.331981162" watchObservedRunningTime="2025-12-09 11:55:54.554063443 +0000 UTC m=+1441.379264967" Dec 09 11:55:54 crc kubenswrapper[4745]: I1209 11:55:54.659483 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zd8rf"] Dec 09 11:55:54 crc kubenswrapper[4745]: W1209 11:55:54.689763 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df4272b_8d98_4788_8513_1f3ab014775c.slice/crio-2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54 WatchSource:0}: Error finding container 2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54: Status 404 returned error can't find the container with id 2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54 Dec 09 11:55:55 crc kubenswrapper[4745]: I1209 11:55:55.497118 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" event={"ID":"8df4272b-8d98-4788-8513-1f3ab014775c","Type":"ContainerStarted","Data":"38870d77b6d142f1c1cade48e3d884f4d22c05a1003d9c79fc17066cda6fa023"} Dec 09 11:55:55 crc kubenswrapper[4745]: I1209 11:55:55.497497 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" event={"ID":"8df4272b-8d98-4788-8513-1f3ab014775c","Type":"ContainerStarted","Data":"2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54"} Dec 09 11:55:55 crc kubenswrapper[4745]: I1209 11:55:55.501630 4745 generic.go:334] "Generic (PLEG): container finished" podID="943383c2-4d4b-47bd-bb39-1d671795e573" containerID="bb8d2b4217a393f956597ec47aee5cf1a2a5a42823f6874f39aab5832c31fef0" exitCode=0 
Dec 09 11:55:55 crc kubenswrapper[4745]: I1209 11:55:55.503155 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" event={"ID":"943383c2-4d4b-47bd-bb39-1d671795e573","Type":"ContainerDied","Data":"bb8d2b4217a393f956597ec47aee5cf1a2a5a42823f6874f39aab5832c31fef0"} Dec 09 11:55:55 crc kubenswrapper[4745]: I1209 11:55:55.522975 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" podStartSLOduration=2.522950787 podStartE2EDuration="2.522950787s" podCreationTimestamp="2025-12-09 11:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:55.519428452 +0000 UTC m=+1442.344629986" watchObservedRunningTime="2025-12-09 11:55:55.522950787 +0000 UTC m=+1442.348152311" Dec 09 11:55:56 crc kubenswrapper[4745]: I1209 11:55:56.467745 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:55:56 crc kubenswrapper[4745]: I1209 11:55:56.479600 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:55:58 crc kubenswrapper[4745]: I1209 11:55:58.561660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b","Type":"ContainerStarted","Data":"67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18"} Dec 09 11:55:58 crc kubenswrapper[4745]: I1209 11:55:58.590250 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.489080971 podStartE2EDuration="6.59021921s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="2025-12-09 11:55:53.956990424 +0000 UTC m=+1440.782191948" lastFinishedPulling="2025-12-09 11:55:58.058128663 +0000 UTC m=+1444.883330187" observedRunningTime="2025-12-09 11:55:58.57870169 
+0000 UTC m=+1445.403903214" watchObservedRunningTime="2025-12-09 11:55:58.59021921 +0000 UTC m=+1445.415420734" Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.578198 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerStarted","Data":"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.578584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerStarted","Data":"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.578360 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-log" containerID="cri-o://416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" gracePeriod=30 Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.578490 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-metadata" containerID="cri-o://334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" gracePeriod=30 Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.582686 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"734b7a03-cf8d-44cc-abd1-05219561080c","Type":"ContainerStarted","Data":"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.583070 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="734b7a03-cf8d-44cc-abd1-05219561080c" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513" gracePeriod=30 Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.592280 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" event={"ID":"943383c2-4d4b-47bd-bb39-1d671795e573","Type":"ContainerStarted","Data":"76271e94b457866083056a11543c6a26ba1888d918c47a9aedefee7c65b6293d"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.592356 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.606961 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerStarted","Data":"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.607009 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerStarted","Data":"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8"} Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.617612 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.831582038 podStartE2EDuration="7.617584301s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="2025-12-09 11:55:54.271218126 +0000 UTC m=+1441.096419650" lastFinishedPulling="2025-12-09 11:55:58.057220389 +0000 UTC m=+1444.882421913" observedRunningTime="2025-12-09 11:55:59.603163912 +0000 UTC m=+1446.428365436" watchObservedRunningTime="2025-12-09 11:55:59.617584301 +0000 UTC m=+1446.442785825" Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.637117 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.8339358519999998 podStartE2EDuration="7.637076566s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="2025-12-09 11:55:54.252395809 +0000 UTC m=+1441.077597333" lastFinishedPulling="2025-12-09 11:55:58.055536513 +0000 UTC m=+1444.880738047" observedRunningTime="2025-12-09 11:55:59.624919688 +0000 UTC m=+1446.450121222" watchObservedRunningTime="2025-12-09 11:55:59.637076566 +0000 UTC m=+1446.462278110" Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.664032 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" podStartSLOduration=7.664005252 podStartE2EDuration="7.664005252s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:55:59.656396037 +0000 UTC m=+1446.481597561" watchObservedRunningTime="2025-12-09 11:55:59.664005252 +0000 UTC m=+1446.489206776" Dec 09 11:55:59 crc kubenswrapper[4745]: I1209 11:55:59.681085 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.5937988069999998 podStartE2EDuration="7.681051482s" podCreationTimestamp="2025-12-09 11:55:52 +0000 UTC" firstStartedPulling="2025-12-09 11:55:53.984527406 +0000 UTC m=+1440.809728940" lastFinishedPulling="2025-12-09 11:55:58.071780081 +0000 UTC m=+1444.896981615" observedRunningTime="2025-12-09 11:55:59.672348087 +0000 UTC m=+1446.497549611" watchObservedRunningTime="2025-12-09 11:55:59.681051482 +0000 UTC m=+1446.506253006" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.587531 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.619911 4745 generic.go:334] "Generic (PLEG): container finished" podID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerID="334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" exitCode=0 Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.620005 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.620025 4745 generic.go:334] "Generic (PLEG): container finished" podID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerID="416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" exitCode=143 Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.619986 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerDied","Data":"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508"} Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.621193 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerDied","Data":"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22"} Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.621232 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adf409d4-e93e-4a5d-8161-60841cb2e21f","Type":"ContainerDied","Data":"d6c7f691fd18c57f3a26fb8d11d35a9a621c18143ce5e9f54714c1c7fcc01232"} Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.621258 4745 scope.go:117] "RemoveContainer" containerID="334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.647884 4745 scope.go:117] "RemoveContainer" 
containerID="416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.656828 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs\") pod \"adf409d4-e93e-4a5d-8161-60841cb2e21f\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.656892 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle\") pod \"adf409d4-e93e-4a5d-8161-60841cb2e21f\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.656947 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhg4q\" (UniqueName: \"kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q\") pod \"adf409d4-e93e-4a5d-8161-60841cb2e21f\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.656969 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data\") pod \"adf409d4-e93e-4a5d-8161-60841cb2e21f\" (UID: \"adf409d4-e93e-4a5d-8161-60841cb2e21f\") " Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.657564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs" (OuterVolumeSpecName: "logs") pod "adf409d4-e93e-4a5d-8161-60841cb2e21f" (UID: "adf409d4-e93e-4a5d-8161-60841cb2e21f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.661777 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adf409d4-e93e-4a5d-8161-60841cb2e21f-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.685373 4745 scope.go:117] "RemoveContainer" containerID="334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.685882 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q" (OuterVolumeSpecName: "kube-api-access-fhg4q") pod "adf409d4-e93e-4a5d-8161-60841cb2e21f" (UID: "adf409d4-e93e-4a5d-8161-60841cb2e21f"). InnerVolumeSpecName "kube-api-access-fhg4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:00 crc kubenswrapper[4745]: E1209 11:56:00.686210 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508\": container with ID starting with 334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508 not found: ID does not exist" containerID="334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.686269 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508"} err="failed to get container status \"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508\": rpc error: code = NotFound desc = could not find container \"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508\": container with ID starting with 334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508 not found: ID does not 
exist" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.686300 4745 scope.go:117] "RemoveContainer" containerID="416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" Dec 09 11:56:00 crc kubenswrapper[4745]: E1209 11:56:00.686876 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22\": container with ID starting with 416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22 not found: ID does not exist" containerID="416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.686933 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22"} err="failed to get container status \"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22\": rpc error: code = NotFound desc = could not find container \"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22\": container with ID starting with 416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22 not found: ID does not exist" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.686979 4745 scope.go:117] "RemoveContainer" containerID="334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.687687 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508"} err="failed to get container status \"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508\": rpc error: code = NotFound desc = could not find container \"334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508\": container with ID starting with 334c884b2da6c7727d19aea49c2da65959d81cebcb0e217cc17eb10e2ba1e508 not found: ID 
does not exist" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.687750 4745 scope.go:117] "RemoveContainer" containerID="416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.688076 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22"} err="failed to get container status \"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22\": rpc error: code = NotFound desc = could not find container \"416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22\": container with ID starting with 416f7f21b2f23d3ed46359e989a85f2dc8c24158fdac04ce2983eccbf9aaee22 not found: ID does not exist" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.688652 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data" (OuterVolumeSpecName: "config-data") pod "adf409d4-e93e-4a5d-8161-60841cb2e21f" (UID: "adf409d4-e93e-4a5d-8161-60841cb2e21f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.702578 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adf409d4-e93e-4a5d-8161-60841cb2e21f" (UID: "adf409d4-e93e-4a5d-8161-60841cb2e21f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.764690 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.764728 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhg4q\" (UniqueName: \"kubernetes.io/projected/adf409d4-e93e-4a5d-8161-60841cb2e21f-kube-api-access-fhg4q\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.764740 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf409d4-e93e-4a5d-8161-60841cb2e21f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.955787 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.965739 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.981686 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:00 crc kubenswrapper[4745]: E1209 11:56:00.982152 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-log" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.982172 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-log" Dec 09 11:56:00 crc kubenswrapper[4745]: E1209 11:56:00.982201 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-metadata" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.982207 4745 
state_mem.go:107] "Deleted CPUSet assignment" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-metadata" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.982403 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-log" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.982441 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" containerName="nova-metadata-metadata" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.983565 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.993855 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 11:56:00 crc kubenswrapper[4745]: I1209 11:56:00.994587 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.007259 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.070483 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.070609 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 
crc kubenswrapper[4745]: I1209 11:56:01.071066 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.071437 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.071646 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngms\" (UniqueName: \"kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.174435 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.174580 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xngms\" (UniqueName: \"kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.174618 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.174673 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.174768 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.175360 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.181375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.186244 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " 
pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.192667 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.197910 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngms\" (UniqueName: \"kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms\") pod \"nova-metadata-0\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") " pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.302328 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.576934 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf409d4-e93e-4a5d-8161-60841cb2e21f" path="/var/lib/kubelet/pods/adf409d4-e93e-4a5d-8161-60841cb2e21f/volumes" Dec 09 11:56:01 crc kubenswrapper[4745]: I1209 11:56:01.846472 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:01 crc kubenswrapper[4745]: W1209 11:56:01.861290 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedb70f4a_bb9e_44ad_845e_e7f85417803f.slice/crio-f13e73c573dac20dc572d95d7bdb73615049f20bfc8d38dca5549be8044ee100 WatchSource:0}: Error finding container f13e73c573dac20dc572d95d7bdb73615049f20bfc8d38dca5549be8044ee100: Status 404 returned error can't find the container with id f13e73c573dac20dc572d95d7bdb73615049f20bfc8d38dca5549be8044ee100 Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.647055 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerStarted","Data":"ad0856e2bb7d8ad56be2fc1a0a3e1aa44622fe1ed9bd86c94c6f2036ade2703f"} Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.647438 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerStarted","Data":"d131f1eefcee52d195b728e691dd6b03a66eb7ab6fa5d5daaa762fc39fcf0fc4"} Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.647457 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerStarted","Data":"f13e73c573dac20dc572d95d7bdb73615049f20bfc8d38dca5549be8044ee100"} Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.667929 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6679002069999997 podStartE2EDuration="2.667900207s" podCreationTimestamp="2025-12-09 11:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:02.667702872 +0000 UTC m=+1449.492904396" watchObservedRunningTime="2025-12-09 11:56:02.667900207 +0000 UTC m=+1449.493101731" Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.820596 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.820673 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.958172 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.958244 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Dec 09 11:56:02 crc kubenswrapper[4745]: I1209 11:56:02.992975 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.270709 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.573712 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.700296 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"] Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.701261 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="dnsmasq-dns" containerID="cri-o://32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a" gracePeriod=10 Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.734476 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 11:56:03 crc kubenswrapper[4745]: E1209 11:56:03.888023 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad70250d_1392_4c1f_b661_09709cb9d7b0.slice/crio-32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a.scope\": RecentStats: unable to find data in memory cache]" Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.910750 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Dec 09 11:56:03 crc kubenswrapper[4745]: I1209 11:56:03.911204 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.288449 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448598 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxtxq\" (UniqueName: \"kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448697 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " 
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448880 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.448923 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb\") pod \"ad70250d-1392-4c1f-b661-09709cb9d7b0\" (UID: \"ad70250d-1392-4c1f-b661-09709cb9d7b0\") " Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.472805 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq" (OuterVolumeSpecName: "kube-api-access-zxtxq") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "kube-api-access-zxtxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.556570 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxtxq\" (UniqueName: \"kubernetes.io/projected/ad70250d-1392-4c1f-b661-09709cb9d7b0-kube-api-access-zxtxq\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.576257 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.616218 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config" (OuterVolumeSpecName: "config") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.619665 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.628853 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.629231 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad70250d-1392-4c1f-b661-09709cb9d7b0" (UID: "ad70250d-1392-4c1f-b661-09709cb9d7b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.699173 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.699900 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.700494 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.700536 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.700548 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad70250d-1392-4c1f-b661-09709cb9d7b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.713157 4745 generic.go:334] "Generic (PLEG): container finished" podID="d34f458f-6a4f-416a-96b9-18dfd1bb1452" containerID="71590042ec928a9bfe46dae653af9e69e9269e11cd2b38ede0e4c60102844e9d" exitCode=0
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.713219 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r7bpq" event={"ID":"d34f458f-6a4f-416a-96b9-18dfd1bb1452","Type":"ContainerDied","Data":"71590042ec928a9bfe46dae653af9e69e9269e11cd2b38ede0e4c60102844e9d"}
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.718145 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerID="32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a" exitCode=0
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.719350 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.720811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" event={"ID":"ad70250d-1392-4c1f-b661-09709cb9d7b0","Type":"ContainerDied","Data":"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"}
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.720873 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-2c4hw" event={"ID":"ad70250d-1392-4c1f-b661-09709cb9d7b0","Type":"ContainerDied","Data":"804aa3fc361d35883323f0a41d9e0ac6467b291cd2b36c6cc5de42e1b4c78913"}
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.720900 4745 scope.go:117] "RemoveContainer" containerID="32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.774294 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"]
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.787989 4745 scope.go:117] "RemoveContainer" containerID="2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.788939 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-2c4hw"]
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.811677 4745 scope.go:117] "RemoveContainer" containerID="32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"
Dec 09 11:56:04 crc kubenswrapper[4745]: E1209 11:56:04.812623 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a\": container with ID starting with 32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a not found: ID does not exist" containerID="32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.812809 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a"} err="failed to get container status \"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a\": rpc error: code = NotFound desc = could not find container \"32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a\": container with ID starting with 32df404b009804b0eefbb14355816df8c172aa6a4f3a9a44a75807934619ac4a not found: ID does not exist"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.813576 4745 scope.go:117] "RemoveContainer" containerID="2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2"
Dec 09 11:56:04 crc kubenswrapper[4745]: E1209 11:56:04.814121 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2\": container with ID starting with 2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2 not found: ID does not exist" containerID="2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2"
Dec 09 11:56:04 crc kubenswrapper[4745]: I1209 11:56:04.814151 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2"} err="failed to get container status \"2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2\": rpc error: code = NotFound desc = could not find container \"2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2\": container with ID starting with 2dace4bf90f864726e9d6d208dc0ce8d2d17d003c57ae4ae3439365dc03ebce2 not found: ID does not exist"
Dec 09 11:56:05 crc kubenswrapper[4745]: I1209 11:56:05.569581 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" path="/var/lib/kubelet/pods/ad70250d-1392-4c1f-b661-09709cb9d7b0/volumes"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.103936 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r7bpq"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.130177 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw2mg\" (UniqueName: \"kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg\") pod \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") "
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.131088 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data\") pod \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") "
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.131179 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle\") pod \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") "
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.131319 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts\") pod \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\" (UID: \"d34f458f-6a4f-416a-96b9-18dfd1bb1452\") "
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.136189 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts" (OuterVolumeSpecName: "scripts") pod "d34f458f-6a4f-416a-96b9-18dfd1bb1452" (UID: "d34f458f-6a4f-416a-96b9-18dfd1bb1452"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.136296 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg" (OuterVolumeSpecName: "kube-api-access-dw2mg") pod "d34f458f-6a4f-416a-96b9-18dfd1bb1452" (UID: "d34f458f-6a4f-416a-96b9-18dfd1bb1452"). InnerVolumeSpecName "kube-api-access-dw2mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.160828 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data" (OuterVolumeSpecName: "config-data") pod "d34f458f-6a4f-416a-96b9-18dfd1bb1452" (UID: "d34f458f-6a4f-416a-96b9-18dfd1bb1452"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.174063 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34f458f-6a4f-416a-96b9-18dfd1bb1452" (UID: "d34f458f-6a4f-416a-96b9-18dfd1bb1452"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.233779 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw2mg\" (UniqueName: \"kubernetes.io/projected/d34f458f-6a4f-416a-96b9-18dfd1bb1452-kube-api-access-dw2mg\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.233813 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.233822 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.233830 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34f458f-6a4f-416a-96b9-18dfd1bb1452-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.303353 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.303429 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.746305 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r7bpq" event={"ID":"d34f458f-6a4f-416a-96b9-18dfd1bb1452","Type":"ContainerDied","Data":"9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35"}
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.746687 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cded2cc4bbf356e71100d148f6eacb6bdba2dcaa5cfc01cb20752fbb2bdde35"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.746772 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r7bpq"
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.850842 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.851204 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-log" containerID="cri-o://fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8" gracePeriod=30
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.851297 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-api" containerID="cri-o://eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26" gracePeriod=30
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.872875 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.881544 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerName="nova-scheduler-scheduler" containerID="cri-o://67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" gracePeriod=30
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.898259 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.898563 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-log" containerID="cri-o://d131f1eefcee52d195b728e691dd6b03a66eb7ab6fa5d5daaa762fc39fcf0fc4" gracePeriod=30
Dec 09 11:56:06 crc kubenswrapper[4745]: I1209 11:56:06.898727 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-metadata" containerID="cri-o://ad0856e2bb7d8ad56be2fc1a0a3e1aa44622fe1ed9bd86c94c6f2036ade2703f" gracePeriod=30
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.763963 4745 generic.go:334] "Generic (PLEG): container finished" podID="8aef32c3-9850-494c-a370-3207772e6d07" containerID="fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8" exitCode=143
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.764036 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerDied","Data":"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8"}
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.767149 4745 generic.go:334] "Generic (PLEG): container finished" podID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerID="ad0856e2bb7d8ad56be2fc1a0a3e1aa44622fe1ed9bd86c94c6f2036ade2703f" exitCode=0
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.767174 4745 generic.go:334] "Generic (PLEG): container finished" podID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerID="d131f1eefcee52d195b728e691dd6b03a66eb7ab6fa5d5daaa762fc39fcf0fc4" exitCode=143
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.767189 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerDied","Data":"ad0856e2bb7d8ad56be2fc1a0a3e1aa44622fe1ed9bd86c94c6f2036ade2703f"}
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.767221 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerDied","Data":"d131f1eefcee52d195b728e691dd6b03a66eb7ab6fa5d5daaa762fc39fcf0fc4"}
Dec 09 11:56:07 crc kubenswrapper[4745]: E1209 11:56:07.961122 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 11:56:07 crc kubenswrapper[4745]: E1209 11:56:07.964991 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 11:56:07 crc kubenswrapper[4745]: E1209 11:56:07.966297 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 09 11:56:07 crc kubenswrapper[4745]: E1209 11:56:07.966389 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerName="nova-scheduler-scheduler"
Dec 09 11:56:07 crc kubenswrapper[4745]: I1209 11:56:07.993831 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.117965 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data\") pod \"edb70f4a-bb9e-44ad-845e-e7f85417803f\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") "
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.118251 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle\") pod \"edb70f4a-bb9e-44ad-845e-e7f85417803f\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") "
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.118418 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs\") pod \"edb70f4a-bb9e-44ad-845e-e7f85417803f\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") "
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.118498 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xngms\" (UniqueName: \"kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms\") pod \"edb70f4a-bb9e-44ad-845e-e7f85417803f\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") "
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.118577 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs\") pod \"edb70f4a-bb9e-44ad-845e-e7f85417803f\" (UID: \"edb70f4a-bb9e-44ad-845e-e7f85417803f\") "
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.118774 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs" (OuterVolumeSpecName: "logs") pod "edb70f4a-bb9e-44ad-845e-e7f85417803f" (UID: "edb70f4a-bb9e-44ad-845e-e7f85417803f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.119272 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb70f4a-bb9e-44ad-845e-e7f85417803f-logs\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.127776 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms" (OuterVolumeSpecName: "kube-api-access-xngms") pod "edb70f4a-bb9e-44ad-845e-e7f85417803f" (UID: "edb70f4a-bb9e-44ad-845e-e7f85417803f"). InnerVolumeSpecName "kube-api-access-xngms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.148538 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edb70f4a-bb9e-44ad-845e-e7f85417803f" (UID: "edb70f4a-bb9e-44ad-845e-e7f85417803f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.150658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data" (OuterVolumeSpecName: "config-data") pod "edb70f4a-bb9e-44ad-845e-e7f85417803f" (UID: "edb70f4a-bb9e-44ad-845e-e7f85417803f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.176821 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "edb70f4a-bb9e-44ad-845e-e7f85417803f" (UID: "edb70f4a-bb9e-44ad-845e-e7f85417803f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.221834 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.221890 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.221913 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xngms\" (UniqueName: \"kubernetes.io/projected/edb70f4a-bb9e-44ad-845e-e7f85417803f-kube-api-access-xngms\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.221932 4745 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb70f4a-bb9e-44ad-845e-e7f85417803f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.780673 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb70f4a-bb9e-44ad-845e-e7f85417803f","Type":"ContainerDied","Data":"f13e73c573dac20dc572d95d7bdb73615049f20bfc8d38dca5549be8044ee100"}
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.780747 4745 scope.go:117] "RemoveContainer" containerID="ad0856e2bb7d8ad56be2fc1a0a3e1aa44622fe1ed9bd86c94c6f2036ade2703f"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.780755 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.811273 4745 scope.go:117] "RemoveContainer" containerID="d131f1eefcee52d195b728e691dd6b03a66eb7ab6fa5d5daaa762fc39fcf0fc4"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.823679 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.843739 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.855697 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:08 crc kubenswrapper[4745]: E1209 11:56:08.856229 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="init"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856253 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="init"
Dec 09 11:56:08 crc kubenswrapper[4745]: E1209 11:56:08.856274 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34f458f-6a4f-416a-96b9-18dfd1bb1452" containerName="nova-manage"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856282 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34f458f-6a4f-416a-96b9-18dfd1bb1452" containerName="nova-manage"
Dec 09 11:56:08 crc kubenswrapper[4745]: E1209 11:56:08.856298 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-metadata"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856305 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-metadata"
Dec 09 11:56:08 crc kubenswrapper[4745]: E1209 11:56:08.856317 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-log"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856323 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-log"
Dec 09 11:56:08 crc kubenswrapper[4745]: E1209 11:56:08.856340 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="dnsmasq-dns"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856348 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="dnsmasq-dns"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856560 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34f458f-6a4f-416a-96b9-18dfd1bb1452" containerName="nova-manage"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856576 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad70250d-1392-4c1f-b661-09709cb9d7b0" containerName="dnsmasq-dns"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856591 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-metadata"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.856605 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" containerName="nova-metadata-log"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.857925 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.865170 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.866182 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 09 11:56:08 crc kubenswrapper[4745]: I1209 11:56:08.870224 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.037616 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.037759 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.037824 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.037976 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.038140 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcjfp\" (UniqueName: \"kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.140438 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.140551 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcjfp\" (UniqueName: \"kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.140646 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.140676 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.140702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.141147 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.145040 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.148238 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.155869 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.160390 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcjfp\" (UniqueName: \"kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp\") pod \"nova-metadata-0\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.188958 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.568125 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb70f4a-bb9e-44ad-845e-e7f85417803f" path="/var/lib/kubelet/pods/edb70f4a-bb9e-44ad-845e-e7f85417803f/volumes"
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.630876 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 09 11:56:09 crc kubenswrapper[4745]: I1209 11:56:09.805377 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerStarted","Data":"a8c82ae0674b271440193e8c7a827faf631ac270ac83e8d9c17cf755747b1a23"}
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.523253 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.681045 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle\") pod \"8aef32c3-9850-494c-a370-3207772e6d07\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") "
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.681691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgwpz\" (UniqueName: \"kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz\") pod \"8aef32c3-9850-494c-a370-3207772e6d07\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") "
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.681734 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data\") pod \"8aef32c3-9850-494c-a370-3207772e6d07\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") "
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.681836 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs\") pod \"8aef32c3-9850-494c-a370-3207772e6d07\" (UID: \"8aef32c3-9850-494c-a370-3207772e6d07\") "
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.682783 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs" (OuterVolumeSpecName: "logs") pod "8aef32c3-9850-494c-a370-3207772e6d07" (UID: "8aef32c3-9850-494c-a370-3207772e6d07"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.690998 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz" (OuterVolumeSpecName: "kube-api-access-mgwpz") pod "8aef32c3-9850-494c-a370-3207772e6d07" (UID: "8aef32c3-9850-494c-a370-3207772e6d07"). InnerVolumeSpecName "kube-api-access-mgwpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.717793 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aef32c3-9850-494c-a370-3207772e6d07" (UID: "8aef32c3-9850-494c-a370-3207772e6d07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.729627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data" (OuterVolumeSpecName: "config-data") pod "8aef32c3-9850-494c-a370-3207772e6d07" (UID: "8aef32c3-9850-494c-a370-3207772e6d07"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.783959 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aef32c3-9850-494c-a370-3207772e6d07-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.783992 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.784002 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgwpz\" (UniqueName: \"kubernetes.io/projected/8aef32c3-9850-494c-a370-3207772e6d07-kube-api-access-mgwpz\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.784014 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aef32c3-9850-494c-a370-3207772e6d07-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.816823 4745 generic.go:334] "Generic (PLEG): container finished" podID="8df4272b-8d98-4788-8513-1f3ab014775c" containerID="38870d77b6d142f1c1cade48e3d884f4d22c05a1003d9c79fc17066cda6fa023" exitCode=0 Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.816892 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" event={"ID":"8df4272b-8d98-4788-8513-1f3ab014775c","Type":"ContainerDied","Data":"38870d77b6d142f1c1cade48e3d884f4d22c05a1003d9c79fc17066cda6fa023"} Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.821584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerStarted","Data":"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3"} Dec 09 
11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.822028 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerStarted","Data":"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6"} Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.841418 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerID="67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" exitCode=0 Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.841605 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b","Type":"ContainerDied","Data":"67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18"} Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.856642 4745 generic.go:334] "Generic (PLEG): container finished" podID="8aef32c3-9850-494c-a370-3207772e6d07" containerID="eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26" exitCode=0 Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.856741 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerDied","Data":"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26"} Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.856830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aef32c3-9850-494c-a370-3207772e6d07","Type":"ContainerDied","Data":"c62c3ebd05e2cd2ed9713fa2816f00d8063a54b7665aa30cdb7898dade1e8402"} Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.856853 4745 scope.go:117] "RemoveContainer" containerID="eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.856882 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.897299 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.897274886 podStartE2EDuration="2.897274886s" podCreationTimestamp="2025-12-09 11:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:10.877618706 +0000 UTC m=+1457.702820230" watchObservedRunningTime="2025-12-09 11:56:10.897274886 +0000 UTC m=+1457.722476410" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.918299 4745 scope.go:117] "RemoveContainer" containerID="fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.927170 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.950536 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.956056 4745 scope.go:117] "RemoveContainer" containerID="eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26" Dec 09 11:56:10 crc kubenswrapper[4745]: E1209 11:56:10.956887 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26\": container with ID starting with eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26 not found: ID does not exist" containerID="eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.956934 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26"} err="failed to get container status 
\"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26\": rpc error: code = NotFound desc = could not find container \"eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26\": container with ID starting with eaf018e0afd5899426dc1101e9d098617d96d7dd35536e09500578ce4eeeba26 not found: ID does not exist" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.956964 4745 scope.go:117] "RemoveContainer" containerID="fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.958733 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:10 crc kubenswrapper[4745]: E1209 11:56:10.959285 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-api" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.959306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-api" Dec 09 11:56:10 crc kubenswrapper[4745]: E1209 11:56:10.959329 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-log" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.959334 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-log" Dec 09 11:56:10 crc kubenswrapper[4745]: E1209 11:56:10.959365 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8\": container with ID starting with fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8 not found: ID does not exist" containerID="fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.959430 4745 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8"} err="failed to get container status \"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8\": rpc error: code = NotFound desc = could not find container \"fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8\": container with ID starting with fd3eec2bc5053cbd1956cb5232c9b61af79359fb2f0cdcd9ed10fed05fb00cd8 not found: ID does not exist" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.959582 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-log" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.959603 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aef32c3-9850-494c-a370-3207772e6d07" containerName="nova-api-api" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.960829 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.970124 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.970028 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.992328 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.992473 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.992534 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:10 crc kubenswrapper[4745]: I1209 11:56:10.992597 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbncb\" (UniqueName: \"kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.089981 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.095823 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbncb\" (UniqueName: \"kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.096110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.096313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.096401 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.097117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.101276 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.101660 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.127229 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbncb\" (UniqueName: \"kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb\") pod \"nova-api-0\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.198409 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle\") pod \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.198791 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data\") pod \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.198890 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cz6r\" (UniqueName: \"kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r\") pod \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\" (UID: \"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b\") " Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 
11:56:11.202104 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r" (OuterVolumeSpecName: "kube-api-access-6cz6r") pod "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" (UID: "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b"). InnerVolumeSpecName "kube-api-access-6cz6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.228768 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" (UID: "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.231770 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data" (OuterVolumeSpecName: "config-data") pod "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" (UID: "ad74d3e0-2e9c-4caa-9472-1c7a1faba83b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.294538 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.301649 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.301682 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.301696 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cz6r\" (UniqueName: \"kubernetes.io/projected/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b-kube-api-access-6cz6r\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.568953 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aef32c3-9850-494c-a370-3207772e6d07" path="/var/lib/kubelet/pods/8aef32c3-9850-494c-a370-3207772e6d07/volumes" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.872787 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad74d3e0-2e9c-4caa-9472-1c7a1faba83b","Type":"ContainerDied","Data":"14777599fa86159cc031c09c718e9680ca048cff1813b17c6f27f39ff7d14b6e"} Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.873337 4745 scope.go:117] "RemoveContainer" containerID="67909808c01b9d507ca5559c6b0f6e7493fcfc0215e858aff334671f71d7be18" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.872838 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.920227 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.942577 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.966599 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:11 crc kubenswrapper[4745]: E1209 11:56:11.967261 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerName="nova-scheduler-scheduler" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.967290 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerName="nova-scheduler-scheduler" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.967596 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" containerName="nova-scheduler-scheduler" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.968604 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.975249 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:56:11 crc kubenswrapper[4745]: I1209 11:56:11.996702 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.012201 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.031122 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.031186 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.031429 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827g8\" (UniqueName: \"kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.134436 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.134624 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.134724 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-827g8\" (UniqueName: \"kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.144773 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.148707 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.156607 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-827g8\" (UniqueName: \"kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8\") pod \"nova-scheduler-0\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.279119 4745 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.304132 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.454978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle\") pod \"8df4272b-8d98-4788-8513-1f3ab014775c\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.455108 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts\") pod \"8df4272b-8d98-4788-8513-1f3ab014775c\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.455408 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data\") pod \"8df4272b-8d98-4788-8513-1f3ab014775c\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.455449 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7x5\" (UniqueName: \"kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5\") pod \"8df4272b-8d98-4788-8513-1f3ab014775c\" (UID: \"8df4272b-8d98-4788-8513-1f3ab014775c\") " Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.472839 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts" (OuterVolumeSpecName: "scripts") pod "8df4272b-8d98-4788-8513-1f3ab014775c" (UID: 
"8df4272b-8d98-4788-8513-1f3ab014775c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.486844 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5" (OuterVolumeSpecName: "kube-api-access-cv7x5") pod "8df4272b-8d98-4788-8513-1f3ab014775c" (UID: "8df4272b-8d98-4788-8513-1f3ab014775c"). InnerVolumeSpecName "kube-api-access-cv7x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.524842 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data" (OuterVolumeSpecName: "config-data") pod "8df4272b-8d98-4788-8513-1f3ab014775c" (UID: "8df4272b-8d98-4788-8513-1f3ab014775c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.567827 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8df4272b-8d98-4788-8513-1f3ab014775c" (UID: "8df4272b-8d98-4788-8513-1f3ab014775c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.571373 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.571418 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv7x5\" (UniqueName: \"kubernetes.io/projected/8df4272b-8d98-4788-8513-1f3ab014775c-kube-api-access-cv7x5\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.571428 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.571436 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df4272b-8d98-4788-8513-1f3ab014775c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.886476 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerStarted","Data":"5e96df06f0c362e2167c371ab2282d459752dd30a40ae2aaa99fe0e740343b7e"} Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.886548 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerStarted","Data":"ab6fa1af4f7059722b7284aa0c0ffcb9e234cc96736c8edf3c78e4a735fb551e"} Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.886566 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerStarted","Data":"8a8a0e51324d52bd72bc55d6122c8f9d25b3d66d058e0ed21e9ac001e794f1e5"} Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.900600 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" event={"ID":"8df4272b-8d98-4788-8513-1f3ab014775c","Type":"ContainerDied","Data":"2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54"} Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.900655 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2767ff3c1c7ea5568c740f5d2d9880053190b5e5f45eb7a090750c0fa3c93e54" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.900709 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zd8rf" Dec 09 11:56:12 crc kubenswrapper[4745]: I1209 11:56:12.930268 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.930206211 podStartE2EDuration="2.930206211s" podCreationTimestamp="2025-12-09 11:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:12.905157635 +0000 UTC m=+1459.730359169" watchObservedRunningTime="2025-12-09 11:56:12.930206211 +0000 UTC m=+1459.755407745" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.013236 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:56:13 crc kubenswrapper[4745]: E1209 11:56:13.013945 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4272b-8d98-4788-8513-1f3ab014775c" containerName="nova-cell1-conductor-db-sync" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.013968 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4272b-8d98-4788-8513-1f3ab014775c" 
containerName="nova-cell1-conductor-db-sync" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.014247 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4272b-8d98-4788-8513-1f3ab014775c" containerName="nova-cell1-conductor-db-sync" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.015229 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.017821 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.023933 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.031795 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.086010 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.087342 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.087402 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh4wv\" (UniqueName: 
\"kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.191159 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.191350 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.191430 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh4wv\" (UniqueName: \"kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.195998 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.197236 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.210354 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh4wv\" (UniqueName: \"kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv\") pod \"nova-cell1-conductor-0\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.349230 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.594244 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad74d3e0-2e9c-4caa-9472-1c7a1faba83b" path="/var/lib/kubelet/pods/ad74d3e0-2e9c-4caa-9472-1c7a1faba83b/volumes" Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.879251 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:56:13 crc kubenswrapper[4745]: W1209 11:56:13.886296 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f786d16_8a6e_420b_b2b7_f785386e2191.slice/crio-0f99d043f80c96c0219013b5c63ecbffc0c921f477454b7d53e2643d3fd2ab70 WatchSource:0}: Error finding container 0f99d043f80c96c0219013b5c63ecbffc0c921f477454b7d53e2643d3fd2ab70: Status 404 returned error can't find the container with id 0f99d043f80c96c0219013b5c63ecbffc0c921f477454b7d53e2643d3fd2ab70 Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.930661 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f786d16-8a6e-420b-b2b7-f785386e2191","Type":"ContainerStarted","Data":"0f99d043f80c96c0219013b5c63ecbffc0c921f477454b7d53e2643d3fd2ab70"} Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.933866 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65bb33f9-ecd8-4960-8cc3-e9537509e71f","Type":"ContainerStarted","Data":"e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31"} Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.933918 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65bb33f9-ecd8-4960-8cc3-e9537509e71f","Type":"ContainerStarted","Data":"4417887b62a16c9e928d38063f358e137ba25d2c6c20b41dadfb02eff67c05a8"} Dec 09 11:56:13 crc kubenswrapper[4745]: I1209 11:56:13.965663 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.965625798 podStartE2EDuration="2.965625798s" podCreationTimestamp="2025-12-09 11:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:13.959825372 +0000 UTC m=+1460.785043716" watchObservedRunningTime="2025-12-09 11:56:13.965625798 +0000 UTC m=+1460.790827322" Dec 09 11:56:14 crc kubenswrapper[4745]: I1209 11:56:14.189089 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:56:14 crc kubenswrapper[4745]: I1209 11:56:14.189140 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:56:14 crc kubenswrapper[4745]: I1209 11:56:14.955956 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f786d16-8a6e-420b-b2b7-f785386e2191","Type":"ContainerStarted","Data":"709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778"} Dec 09 11:56:14 crc kubenswrapper[4745]: I1209 11:56:14.956589 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:14 crc kubenswrapper[4745]: I1209 11:56:14.989597 4745 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.989567197 podStartE2EDuration="2.989567197s" podCreationTimestamp="2025-12-09 11:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:14.972697122 +0000 UTC m=+1461.797898636" watchObservedRunningTime="2025-12-09 11:56:14.989567197 +0000 UTC m=+1461.814768721" Dec 09 11:56:17 crc kubenswrapper[4745]: I1209 11:56:17.305570 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 11:56:18 crc kubenswrapper[4745]: I1209 11:56:18.685250 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 11:56:19 crc kubenswrapper[4745]: I1209 11:56:19.190171 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:56:19 crc kubenswrapper[4745]: I1209 11:56:19.190282 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 11:56:20 crc kubenswrapper[4745]: I1209 11:56:20.203896 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 11:56:20 crc kubenswrapper[4745]: I1209 11:56:20.204085 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 11:56:21 crc kubenswrapper[4745]: I1209 11:56:21.296577 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:21 crc kubenswrapper[4745]: I1209 11:56:21.297721 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.304578 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.340817 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.377852 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.377957 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.716642 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:22 crc kubenswrapper[4745]: I1209 11:56:22.716962 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d05e5297-a218-42d2-b46a-5c72201d96b4" containerName="kube-state-metrics" containerID="cri-o://523941c45b4eb75f71913c34b201f37a3dbf8c9e8afb7b7136a440e67eeffc4a" gracePeriod=30 Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.065793 4745 generic.go:334] "Generic (PLEG): container finished" podID="d05e5297-a218-42d2-b46a-5c72201d96b4" 
containerID="523941c45b4eb75f71913c34b201f37a3dbf8c9e8afb7b7136a440e67eeffc4a" exitCode=2 Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.067221 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d05e5297-a218-42d2-b46a-5c72201d96b4","Type":"ContainerDied","Data":"523941c45b4eb75f71913c34b201f37a3dbf8c9e8afb7b7136a440e67eeffc4a"} Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.114689 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.313836 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.382474 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.433274 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwg9k\" (UniqueName: \"kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k\") pod \"d05e5297-a218-42d2-b46a-5c72201d96b4\" (UID: \"d05e5297-a218-42d2-b46a-5c72201d96b4\") " Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.442402 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k" (OuterVolumeSpecName: "kube-api-access-mwg9k") pod "d05e5297-a218-42d2-b46a-5c72201d96b4" (UID: "d05e5297-a218-42d2-b46a-5c72201d96b4"). InnerVolumeSpecName "kube-api-access-mwg9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:23 crc kubenswrapper[4745]: I1209 11:56:23.536898 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwg9k\" (UniqueName: \"kubernetes.io/projected/d05e5297-a218-42d2-b46a-5c72201d96b4-kube-api-access-mwg9k\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.099475 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.099726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d05e5297-a218-42d2-b46a-5c72201d96b4","Type":"ContainerDied","Data":"065868c04fb3fb546e76bc2a69a2712f68a0b508915e6543cf4467e3a1d745df"} Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.099796 4745 scope.go:117] "RemoveContainer" containerID="523941c45b4eb75f71913c34b201f37a3dbf8c9e8afb7b7136a440e67eeffc4a" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.152648 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.168243 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.196221 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:24 crc kubenswrapper[4745]: E1209 11:56:24.197043 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05e5297-a218-42d2-b46a-5c72201d96b4" containerName="kube-state-metrics" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.197087 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05e5297-a218-42d2-b46a-5c72201d96b4" containerName="kube-state-metrics" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.200260 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d05e5297-a218-42d2-b46a-5c72201d96b4" containerName="kube-state-metrics" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.201429 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.204095 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.204494 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.205496 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.259874 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.260415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.261037 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 
11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.261086 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfppw\" (UniqueName: \"kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.362669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.362902 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.362941 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfppw\" (UniqueName: \"kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.362997 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc 
kubenswrapper[4745]: I1209 11:56:24.369611 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.371723 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.372109 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.389245 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfppw\" (UniqueName: \"kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw\") pod \"kube-state-metrics-0\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.537122 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.970143 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.970937 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-central-agent" containerID="cri-o://f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44" gracePeriod=30 Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.970984 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="sg-core" containerID="cri-o://824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2" gracePeriod=30 Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.970964 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="proxy-httpd" containerID="cri-o://dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3" gracePeriod=30 Dec 09 11:56:24 crc kubenswrapper[4745]: I1209 11:56:24.971119 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-notification-agent" containerID="cri-o://f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5" gracePeriod=30 Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.026001 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.121571 4745 generic.go:334] "Generic (PLEG): container finished" podID="e53e8e80-072c-424c-97e4-0e78e02db62d" 
containerID="dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3" exitCode=0 Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.121629 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerDied","Data":"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3"} Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.121658 4745 generic.go:334] "Generic (PLEG): container finished" podID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerID="824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2" exitCode=2 Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.121695 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerDied","Data":"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2"} Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.128659 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a506f944-5b99-48af-a714-e24782ba1c06","Type":"ContainerStarted","Data":"75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8"} Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.476149 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.476876 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 
11:56:25 crc kubenswrapper[4745]: I1209 11:56:25.576982 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05e5297-a218-42d2-b46a-5c72201d96b4" path="/var/lib/kubelet/pods/d05e5297-a218-42d2-b46a-5c72201d96b4/volumes" Dec 09 11:56:26 crc kubenswrapper[4745]: I1209 11:56:26.141951 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a506f944-5b99-48af-a714-e24782ba1c06","Type":"ContainerStarted","Data":"84094740573a16863643bdc2da525be2a9ddb0b73b675078b658770516cc0d5e"} Dec 09 11:56:26 crc kubenswrapper[4745]: I1209 11:56:26.142502 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 11:56:26 crc kubenswrapper[4745]: I1209 11:56:26.147307 4745 generic.go:334] "Generic (PLEG): container finished" podID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerID="f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44" exitCode=0 Dec 09 11:56:26 crc kubenswrapper[4745]: I1209 11:56:26.147380 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerDied","Data":"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44"} Dec 09 11:56:26 crc kubenswrapper[4745]: I1209 11:56:26.160813 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.807033631 podStartE2EDuration="2.160778419s" podCreationTimestamp="2025-12-09 11:56:24 +0000 UTC" firstStartedPulling="2025-12-09 11:56:25.033424261 +0000 UTC m=+1471.858625785" lastFinishedPulling="2025-12-09 11:56:25.387169059 +0000 UTC m=+1472.212370573" observedRunningTime="2025-12-09 11:56:26.158557209 +0000 UTC m=+1472.983758753" watchObservedRunningTime="2025-12-09 11:56:26.160778419 +0000 UTC m=+1472.985979943" Dec 09 11:56:29 crc kubenswrapper[4745]: I1209 11:56:29.195456 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:56:29 crc kubenswrapper[4745]: I1209 11:56:29.196835 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 11:56:29 crc kubenswrapper[4745]: I1209 11:56:29.201331 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.037218 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.131147 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.196142 4745 generic.go:334] "Generic (PLEG): container finished" podID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerID="f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5" exitCode=0 Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.196277 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerDied","Data":"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5"} Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.196294 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.196313 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e53e8e80-072c-424c-97e4-0e78e02db62d","Type":"ContainerDied","Data":"1506e2507fab4fa0c7181cc08f8af000d7c9ec6842d0fb97e34ec1541900481e"} Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.196342 4745 scope.go:117] "RemoveContainer" containerID="dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.199901 4745 generic.go:334] "Generic (PLEG): container finished" podID="734b7a03-cf8d-44cc-abd1-05219561080c" containerID="8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513" exitCode=137 Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.200595 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.200621 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"734b7a03-cf8d-44cc-abd1-05219561080c","Type":"ContainerDied","Data":"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513"} Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.200662 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"734b7a03-cf8d-44cc-abd1-05219561080c","Type":"ContainerDied","Data":"079d4f417559e492cc8c54bf3f9d2d88cc82916f811a58c33552b6c7063103e2"} Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.205353 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle\") pod \"734b7a03-cf8d-44cc-abd1-05219561080c\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 
11:56:30.205395 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gn8w\" (UniqueName: \"kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w\") pod \"734b7a03-cf8d-44cc-abd1-05219561080c\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.205539 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data\") pod \"734b7a03-cf8d-44cc-abd1-05219561080c\" (UID: \"734b7a03-cf8d-44cc-abd1-05219561080c\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.209439 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.214449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w" (OuterVolumeSpecName: "kube-api-access-7gn8w") pod "734b7a03-cf8d-44cc-abd1-05219561080c" (UID: "734b7a03-cf8d-44cc-abd1-05219561080c"). InnerVolumeSpecName "kube-api-access-7gn8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.223127 4745 scope.go:117] "RemoveContainer" containerID="824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.250758 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "734b7a03-cf8d-44cc-abd1-05219561080c" (UID: "734b7a03-cf8d-44cc-abd1-05219561080c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.268864 4745 scope.go:117] "RemoveContainer" containerID="f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.269927 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data" (OuterVolumeSpecName: "config-data") pod "734b7a03-cf8d-44cc-abd1-05219561080c" (UID: "734b7a03-cf8d-44cc-abd1-05219561080c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.299157 4745 scope.go:117] "RemoveContainer" containerID="f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.308621 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.308715 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.308759 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.308804 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.308924 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.309026 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt6c4\" (UniqueName: \"kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.309063 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd\") pod \"e53e8e80-072c-424c-97e4-0e78e02db62d\" (UID: \"e53e8e80-072c-424c-97e4-0e78e02db62d\") " Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.309612 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.309632 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b7a03-cf8d-44cc-abd1-05219561080c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.309645 4745 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7gn8w\" (UniqueName: \"kubernetes.io/projected/734b7a03-cf8d-44cc-abd1-05219561080c-kube-api-access-7gn8w\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.311435 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.312163 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.314024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4" (OuterVolumeSpecName: "kube-api-access-vt6c4") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "kube-api-access-vt6c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.326376 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts" (OuterVolumeSpecName: "scripts") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.326672 4745 scope.go:117] "RemoveContainer" containerID="dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.327430 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3\": container with ID starting with dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3 not found: ID does not exist" containerID="dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.327552 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3"} err="failed to get container status \"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3\": rpc error: code = NotFound desc = could not find container \"dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3\": container with ID starting with dfe7e6bcea7cfb1bbbf79eaed213e9e18729b9c1ebaf8fc889a77be7f331d6e3 not found: ID does not exist" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.327638 4745 scope.go:117] "RemoveContainer" containerID="824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.328078 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2\": container with ID starting with 824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2 not found: ID does not exist" containerID="824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.328164 
4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2"} err="failed to get container status \"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2\": rpc error: code = NotFound desc = could not find container \"824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2\": container with ID starting with 824e0d740bb85ebd55226fd00adc79f5d4ce3e05a0cacd414af41c1a48da06b2 not found: ID does not exist" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.328236 4745 scope.go:117] "RemoveContainer" containerID="f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.328724 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5\": container with ID starting with f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5 not found: ID does not exist" containerID="f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.328783 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5"} err="failed to get container status \"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5\": rpc error: code = NotFound desc = could not find container \"f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5\": container with ID starting with f2dbe92f98b2b27e3ce8746ec22402c9bfc7f2f6587c0bf5c3f9de33604557a5 not found: ID does not exist" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.328829 4745 scope.go:117] "RemoveContainer" containerID="f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 
11:56:30.329377 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44\": container with ID starting with f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44 not found: ID does not exist" containerID="f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.329469 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44"} err="failed to get container status \"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44\": rpc error: code = NotFound desc = could not find container \"f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44\": container with ID starting with f80d48d2690ec3e2d9c9a4df4950f7bd370e47146539a512e64490e609477d44 not found: ID does not exist" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.329556 4745 scope.go:117] "RemoveContainer" containerID="8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.341132 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.359667 4745 scope.go:117] "RemoveContainer" containerID="8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.360535 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513\": container with ID starting with 8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513 not found: ID does not exist" containerID="8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.360631 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513"} err="failed to get container status \"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513\": rpc error: code = NotFound desc = could not find container \"8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513\": container with ID starting with 8a30b773a5a2aed0dc4ec1abd6b91b7918c1c394c635ca0dab7b6e7c9dc64513 not found: ID does not exist" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.399788 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413284 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413336 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413351 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413364 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413378 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt6c4\" (UniqueName: \"kubernetes.io/projected/e53e8e80-072c-424c-97e4-0e78e02db62d-kube-api-access-vt6c4\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.413394 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e53e8e80-072c-424c-97e4-0e78e02db62d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.430354 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data" (OuterVolumeSpecName: "config-data") pod "e53e8e80-072c-424c-97e4-0e78e02db62d" (UID: "e53e8e80-072c-424c-97e4-0e78e02db62d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.515368 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53e8e80-072c-424c-97e4-0e78e02db62d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.552107 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.567224 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.600133 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.643039 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.644565 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="sg-core" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.644595 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="sg-core" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.644617 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="proxy-httpd" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.644626 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="proxy-httpd" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.644637 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b7a03-cf8d-44cc-abd1-05219561080c" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 
11:56:30.644644 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b7a03-cf8d-44cc-abd1-05219561080c" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.644675 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-notification-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.644683 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-notification-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: E1209 11:56:30.644696 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-central-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.644703 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-central-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.645070 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-notification-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.645089 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b7a03-cf8d-44cc-abd1-05219561080c" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.645104 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="sg-core" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.645123 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="ceilometer-central-agent" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.645137 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" containerName="proxy-httpd" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.649290 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.654542 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.654708 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.655088 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.666176 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.684108 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.697203 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.698807 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.701631 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.702149 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.702905 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.715112 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735758 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735811 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735860 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735883 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735917 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtkd\" (UniqueName: \"kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735939 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735970 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlmw\" (UniqueName: \"kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.735990 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.736080 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.736229 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.736422 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.736472 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.736595 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837680 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlmw\" 
(UniqueName: \"kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837748 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837787 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837829 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.837956 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838048 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838130 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838167 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838230 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838273 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2rtkd\" (UniqueName: \"kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.838348 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.840889 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.841013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.847920 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.853714 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.853866 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.854101 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.854198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.854242 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.854263 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.855001 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.857671 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.859672 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlmw\" (UniqueName: \"kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:30 crc kubenswrapper[4745]: I1209 11:56:30.861028 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtkd\" (UniqueName: \"kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd\") pod \"ceilometer-0\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " pod="openstack/ceilometer-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.018914 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.032682 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.302700 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.303739 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.308144 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.318953 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.572277 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734b7a03-cf8d-44cc-abd1-05219561080c" path="/var/lib/kubelet/pods/734b7a03-cf8d-44cc-abd1-05219561080c/volumes" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.573725 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53e8e80-072c-424c-97e4-0e78e02db62d" path="/var/lib/kubelet/pods/e53e8e80-072c-424c-97e4-0e78e02db62d/volumes" Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.613761 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:56:31 crc kubenswrapper[4745]: I1209 11:56:31.685240 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.240761 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2193006f-4e85-4b55-a6ab-9237f4c9888f","Type":"ContainerStarted","Data":"f966743286a79147a7637b37640d7567a316c8d3352249f62fcd961a1b853aed"} Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.241553 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"2193006f-4e85-4b55-a6ab-9237f4c9888f","Type":"ContainerStarted","Data":"e11f875f937a177291e0bd6654f176ad55033b644b0ff0a3f2dc38fa91d224f0"} Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.246390 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerStarted","Data":"1b3397ff46b7e06ad1d52246db510c9653cfa63473a4e2392716d4b549dd00d7"} Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.246430 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.250590 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.266898 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.266879708 podStartE2EDuration="2.266879708s" podCreationTimestamp="2025-12-09 11:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:32.264936786 +0000 UTC m=+1479.090138310" watchObservedRunningTime="2025-12-09 11:56:32.266879708 +0000 UTC m=+1479.092081232" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.530347 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.533892 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.571558 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.586963 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.587112 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.587143 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.587163 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.587252 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.587280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sck\" (UniqueName: \"kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.690086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.690732 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.690763 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.691357 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.692201 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.691898 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.692342 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.692400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86sck\" (UniqueName: \"kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.692457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.693690 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.693733 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.721451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sck\" (UniqueName: \"kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck\") pod \"dnsmasq-dns-867cd545c7-gxt6p\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:32 crc kubenswrapper[4745]: I1209 11:56:32.876920 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:33 crc kubenswrapper[4745]: I1209 11:56:33.304578 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerStarted","Data":"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e"} Dec 09 11:56:33 crc kubenswrapper[4745]: I1209 11:56:33.451425 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:56:33 crc kubenswrapper[4745]: W1209 11:56:33.453050 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f843182_a85c_47cf_ba16_414c40a031c5.slice/crio-668cfd4de6cfc22ce653d3913709f0add8a26ece3211b567d5a1b8ea2048775a WatchSource:0}: Error finding container 668cfd4de6cfc22ce653d3913709f0add8a26ece3211b567d5a1b8ea2048775a: Status 404 returned error can't find the container with id 668cfd4de6cfc22ce653d3913709f0add8a26ece3211b567d5a1b8ea2048775a Dec 09 11:56:34 crc kubenswrapper[4745]: I1209 11:56:34.317545 4745 generic.go:334] "Generic (PLEG): container finished" podID="2f843182-a85c-47cf-ba16-414c40a031c5" containerID="97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0" exitCode=0 Dec 09 11:56:34 crc kubenswrapper[4745]: I1209 11:56:34.317670 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" event={"ID":"2f843182-a85c-47cf-ba16-414c40a031c5","Type":"ContainerDied","Data":"97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0"} Dec 09 11:56:34 crc kubenswrapper[4745]: I1209 11:56:34.318114 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" event={"ID":"2f843182-a85c-47cf-ba16-414c40a031c5","Type":"ContainerStarted","Data":"668cfd4de6cfc22ce653d3913709f0add8a26ece3211b567d5a1b8ea2048775a"} Dec 09 11:56:34 crc kubenswrapper[4745]: 
I1209 11:56:34.324220 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerStarted","Data":"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983"} Dec 09 11:56:34 crc kubenswrapper[4745]: I1209 11:56:34.568908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.336185 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerStarted","Data":"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2"} Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.337941 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" event={"ID":"2f843182-a85c-47cf-ba16-414c40a031c5","Type":"ContainerStarted","Data":"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2"} Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.338780 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.374905 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" podStartSLOduration=3.374873829 podStartE2EDuration="3.374873829s" podCreationTimestamp="2025-12-09 11:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:35.363959075 +0000 UTC m=+1482.189160609" watchObservedRunningTime="2025-12-09 11:56:35.374873829 +0000 UTC m=+1482.200075353" Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.724420 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 
11:56:35.747066 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.747369 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-log" containerID="cri-o://ab6fa1af4f7059722b7284aa0c0ffcb9e234cc96736c8edf3c78e4a735fb551e" gracePeriod=30 Dec 09 11:56:35 crc kubenswrapper[4745]: I1209 11:56:35.747689 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-api" containerID="cri-o://5e96df06f0c362e2167c371ab2282d459752dd30a40ae2aaa99fe0e740343b7e" gracePeriod=30 Dec 09 11:56:36 crc kubenswrapper[4745]: I1209 11:56:36.033996 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:36 crc kubenswrapper[4745]: I1209 11:56:36.351140 4745 generic.go:334] "Generic (PLEG): container finished" podID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerID="ab6fa1af4f7059722b7284aa0c0ffcb9e234cc96736c8edf3c78e4a735fb551e" exitCode=143 Dec 09 11:56:36 crc kubenswrapper[4745]: I1209 11:56:36.352939 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerDied","Data":"ab6fa1af4f7059722b7284aa0c0ffcb9e234cc96736c8edf3c78e4a735fb551e"} Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.370476 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerStarted","Data":"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb"} Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.371115 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:56:37 crc 
kubenswrapper[4745]: I1209 11:56:37.371187 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-central-agent" containerID="cri-o://d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e" gracePeriod=30 Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.371285 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="sg-core" containerID="cri-o://8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2" gracePeriod=30 Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.371344 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="proxy-httpd" containerID="cri-o://d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb" gracePeriod=30 Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.371386 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-notification-agent" containerID="cri-o://722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983" gracePeriod=30 Dec 09 11:56:37 crc kubenswrapper[4745]: I1209 11:56:37.408104 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.707628502 podStartE2EDuration="7.408076581s" podCreationTimestamp="2025-12-09 11:56:30 +0000 UTC" firstStartedPulling="2025-12-09 11:56:31.704733221 +0000 UTC m=+1478.529934745" lastFinishedPulling="2025-12-09 11:56:36.4051813 +0000 UTC m=+1483.230382824" observedRunningTime="2025-12-09 11:56:37.398681538 +0000 UTC m=+1484.223883062" watchObservedRunningTime="2025-12-09 11:56:37.408076581 +0000 UTC m=+1484.233278105" Dec 09 
11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384866 4745 generic.go:334] "Generic (PLEG): container finished" podID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerID="d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb" exitCode=0 Dec 09 11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384907 4745 generic.go:334] "Generic (PLEG): container finished" podID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerID="8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2" exitCode=2 Dec 09 11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384917 4745 generic.go:334] "Generic (PLEG): container finished" podID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerID="722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983" exitCode=0 Dec 09 11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384946 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerDied","Data":"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb"} Dec 09 11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384985 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerDied","Data":"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2"} Dec 09 11:56:38 crc kubenswrapper[4745]: I1209 11:56:38.384998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerDied","Data":"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983"} Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.410839 4745 generic.go:334] "Generic (PLEG): container finished" podID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerID="5e96df06f0c362e2167c371ab2282d459752dd30a40ae2aaa99fe0e740343b7e" exitCode=0 Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.410928 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerDied","Data":"5e96df06f0c362e2167c371ab2282d459752dd30a40ae2aaa99fe0e740343b7e"} Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.752546 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.902582 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs\") pod \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.902786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data\") pod \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.903124 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle\") pod \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.903852 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbncb\" (UniqueName: \"kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb\") pod \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\" (UID: \"d7710bc2-5cca-45ef-8464-8e02aa75adc2\") " Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.903982 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs" 
(OuterVolumeSpecName: "logs") pod "d7710bc2-5cca-45ef-8464-8e02aa75adc2" (UID: "d7710bc2-5cca-45ef-8464-8e02aa75adc2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.905787 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7710bc2-5cca-45ef-8464-8e02aa75adc2-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.935885 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb" (OuterVolumeSpecName: "kube-api-access-pbncb") pod "d7710bc2-5cca-45ef-8464-8e02aa75adc2" (UID: "d7710bc2-5cca-45ef-8464-8e02aa75adc2"). InnerVolumeSpecName "kube-api-access-pbncb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.975783 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7710bc2-5cca-45ef-8464-8e02aa75adc2" (UID: "d7710bc2-5cca-45ef-8464-8e02aa75adc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:40 crc kubenswrapper[4745]: I1209 11:56:40.984781 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data" (OuterVolumeSpecName: "config-data") pod "d7710bc2-5cca-45ef-8464-8e02aa75adc2" (UID: "d7710bc2-5cca-45ef-8464-8e02aa75adc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.008694 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.008729 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7710bc2-5cca-45ef-8464-8e02aa75adc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.008741 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbncb\" (UniqueName: \"kubernetes.io/projected/d7710bc2-5cca-45ef-8464-8e02aa75adc2-kube-api-access-pbncb\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.034316 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.073623 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.087541 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.111967 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.112037 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.112080 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rtkd\" (UniqueName: \"kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.116179 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd" (OuterVolumeSpecName: "kube-api-access-2rtkd") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "kube-api-access-2rtkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.160211 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.188505 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.216867 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217054 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217099 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217251 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217411 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data\") pod \"4010d862-1eb2-492d-98c6-1cf34f133ffb\" (UID: \"4010d862-1eb2-492d-98c6-1cf34f133ffb\") " Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217906 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.217957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.218305 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.218329 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.218341 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rtkd\" (UniqueName: \"kubernetes.io/projected/4010d862-1eb2-492d-98c6-1cf34f133ffb-kube-api-access-2rtkd\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.218356 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.218365 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4010d862-1eb2-492d-98c6-1cf34f133ffb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.221315 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts" (OuterVolumeSpecName: "scripts") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.318388 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.320731 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.320778 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.343373 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data" (OuterVolumeSpecName: "config-data") pod "4010d862-1eb2-492d-98c6-1cf34f133ffb" (UID: "4010d862-1eb2-492d-98c6-1cf34f133ffb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.422678 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7710bc2-5cca-45ef-8464-8e02aa75adc2","Type":"ContainerDied","Data":"8a8a0e51324d52bd72bc55d6122c8f9d25b3d66d058e0ed21e9ac001e794f1e5"} Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.423674 4745 scope.go:117] "RemoveContainer" containerID="5e96df06f0c362e2167c371ab2282d459752dd30a40ae2aaa99fe0e740343b7e" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.422740 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.423090 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010d862-1eb2-492d-98c6-1cf34f133ffb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.426861 4745 generic.go:334] "Generic (PLEG): container finished" podID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerID="d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e" exitCode=0 Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.428858 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.430592 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerDied","Data":"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e"} Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.430648 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4010d862-1eb2-492d-98c6-1cf34f133ffb","Type":"ContainerDied","Data":"1b3397ff46b7e06ad1d52246db510c9653cfa63473a4e2392716d4b549dd00d7"} Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.450362 4745 scope.go:117] "RemoveContainer" containerID="ab6fa1af4f7059722b7284aa0c0ffcb9e234cc96736c8edf3c78e4a735fb551e" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.455299 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.475072 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.481161 4745 scope.go:117] "RemoveContainer" containerID="d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.494357 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.527695 4745 scope.go:117] "RemoveContainer" containerID="8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.561887 4745 scope.go:117] "RemoveContainer" containerID="722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.575382 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" path="/var/lib/kubelet/pods/d7710bc2-5cca-45ef-8464-8e02aa75adc2/volumes" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.576815 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577291 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-central-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577308 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-central-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577328 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="proxy-httpd" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577336 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="proxy-httpd" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577351 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-api" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577357 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-api" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577383 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-log" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577390 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-log" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577403 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="sg-core" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577411 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="sg-core" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.577420 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-notification-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.577427 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-notification-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578100 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-log" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578128 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-central-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578144 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7710bc2-5cca-45ef-8464-8e02aa75adc2" containerName="nova-api-api" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578160 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="proxy-httpd" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578174 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="ceilometer-notification-agent" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.578183 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" containerName="sg-core" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.580146 4745 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.580326 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.584267 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.584430 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.584496 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.610949 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.625955 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.627935 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.627986 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.628086 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.628160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrsr\" (UniqueName: \"kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.628213 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.628238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.640162 4745 scope.go:117] "RemoveContainer" containerID="d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.655116 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.660977 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.664398 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.667483 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.668099 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.675929 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.677166 4745 scope.go:117] "RemoveContainer" containerID="d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.680153 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb\": container with ID starting with d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb not found: ID does not exist" containerID="d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.680207 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb"} err="failed to get container status \"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb\": rpc error: code = NotFound desc = could not find container \"d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb\": container with ID starting with d822cbfb8de56e4832091c0de915519951a0cddf7692f0d2f792281dc1dfdbfb not found: ID does not exist" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 
11:56:41.680241 4745 scope.go:117] "RemoveContainer" containerID="8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.683950 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2\": container with ID starting with 8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2 not found: ID does not exist" containerID="8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.684008 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2"} err="failed to get container status \"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2\": rpc error: code = NotFound desc = could not find container \"8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2\": container with ID starting with 8b241e069440569059ca7a87868e57cca5ee19b7640908cc49fc6f903154c8a2 not found: ID does not exist" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.684041 4745 scope.go:117] "RemoveContainer" containerID="722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.684352 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983\": container with ID starting with 722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983 not found: ID does not exist" containerID="722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.684374 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983"} err="failed to get container status \"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983\": rpc error: code = NotFound desc = could not find container \"722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983\": container with ID starting with 722a6efee37e4f56315328c5153e7ea63c736f957ce596014cedd3fd3fff3983 not found: ID does not exist" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.684387 4745 scope.go:117] "RemoveContainer" containerID="d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e" Dec 09 11:56:41 crc kubenswrapper[4745]: E1209 11:56:41.684954 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e\": container with ID starting with d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e not found: ID does not exist" containerID="d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.685018 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e"} err="failed to get container status \"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e\": rpc error: code = NotFound desc = could not find container \"d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e\": container with ID starting with d1c155d532305ad905bd0c171b9255b26f9e630f16a98e235817dd234795628e not found: ID does not exist" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.730800 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data\") pod \"nova-api-0\" (UID: 
\"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.730847 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.730941 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731156 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731188 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrsr\" (UniqueName: \"kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731227 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731256 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzngt\" (UniqueName: \"kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731282 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731313 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs\") pod 
\"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731367 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.731421 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.734067 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.743719 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.748535 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.748591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.758042 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrsr\" (UniqueName: \"kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.777363 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data\") pod \"nova-api-0\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.779753 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdw4"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.781198 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.796297 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.796704 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.806255 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdw4"] Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835432 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzngt\" (UniqueName: \"kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835486 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835560 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835601 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: 
\"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835625 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835679 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835698 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835740 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835790 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835807 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.835830 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pf9m\" (UniqueName: \"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.837471 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.842425 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.842925 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.843292 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.844596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.855669 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.857809 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzngt\" (UniqueName: \"kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.859145 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " pod="openstack/ceilometer-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 
11:56:41.937862 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.938302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.938441 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.938444 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.938567 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pf9m\" (UniqueName: \"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.942539 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.942606 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.944237 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:41 crc kubenswrapper[4745]: I1209 11:56:41.958417 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pf9m\" (UniqueName: \"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m\") pod \"nova-cell1-cell-mapping-dhdw4\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:42 
crc kubenswrapper[4745]: I1209 11:56:42.005993 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.021465 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.506013 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:42 crc kubenswrapper[4745]: W1209 11:56:42.512838 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode32e826d_c08c_4308_9015_696ed1413663.slice/crio-f16800d04724f0e952322eec9f84212248723163a2e2c73c044ddae852e4d52b WatchSource:0}: Error finding container f16800d04724f0e952322eec9f84212248723163a2e2c73c044ddae852e4d52b: Status 404 returned error can't find the container with id f16800d04724f0e952322eec9f84212248723163a2e2c73c044ddae852e4d52b Dec 09 11:56:42 crc kubenswrapper[4745]: W1209 11:56:42.621556 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c851b2_74b9_43d9_82be_3ee896408d78.slice/crio-5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87 WatchSource:0}: Error finding container 5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87: Status 404 returned error can't find the container with id 5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87 Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.627956 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdw4"] Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.720771 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.879481 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.971113 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:56:42 crc kubenswrapper[4745]: I1209 11:56:42.971421 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="dnsmasq-dns" containerID="cri-o://76271e94b457866083056a11543c6a26ba1888d918c47a9aedefee7c65b6293d" gracePeriod=10 Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.514266 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerStarted","Data":"b3582d79abf0111444fa1e04343e911247270c4e59ad138b4d72b67ddc80ca87"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.514798 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerStarted","Data":"f16800d04724f0e952322eec9f84212248723163a2e2c73c044ddae852e4d52b"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.535415 4745 generic.go:334] "Generic (PLEG): container finished" podID="943383c2-4d4b-47bd-bb39-1d671795e573" containerID="76271e94b457866083056a11543c6a26ba1888d918c47a9aedefee7c65b6293d" exitCode=0 Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.535587 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" event={"ID":"943383c2-4d4b-47bd-bb39-1d671795e573","Type":"ContainerDied","Data":"76271e94b457866083056a11543c6a26ba1888d918c47a9aedefee7c65b6293d"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.546128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdw4" 
event={"ID":"26c851b2-74b9-43d9-82be-3ee896408d78","Type":"ContainerStarted","Data":"d2d724612e33be6891fca27b52fe886988ad84a0f8f9f933dfaa7cdd089d1f93"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.546244 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdw4" event={"ID":"26c851b2-74b9-43d9-82be-3ee896408d78","Type":"ContainerStarted","Data":"5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.576154 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: connect: connection refused" Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.594364 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4010d862-1eb2-492d-98c6-1cf34f133ffb" path="/var/lib/kubelet/pods/4010d862-1eb2-492d-98c6-1cf34f133ffb/volumes" Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.595373 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerStarted","Data":"58538038be508da88785afc2f41901e85da407d5338aff2923f4c3727138ffb2"} Dec 09 11:56:43 crc kubenswrapper[4745]: I1209 11:56:43.601104 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dhdw4" podStartSLOduration=2.601069283 podStartE2EDuration="2.601069283s" podCreationTimestamp="2025-12-09 11:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:43.576493831 +0000 UTC m=+1490.401695365" watchObservedRunningTime="2025-12-09 11:56:43.601069283 +0000 UTC m=+1490.426270807" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.088447 4745 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206225 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206473 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206620 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206702 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206947 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zppx2\" (UniqueName: \"kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.206997 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config\") pod \"943383c2-4d4b-47bd-bb39-1d671795e573\" (UID: \"943383c2-4d4b-47bd-bb39-1d671795e573\") " Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.227588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2" (OuterVolumeSpecName: "kube-api-access-zppx2") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "kube-api-access-zppx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.290428 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.299075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.310319 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zppx2\" (UniqueName: \"kubernetes.io/projected/943383c2-4d4b-47bd-bb39-1d671795e573-kube-api-access-zppx2\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.310373 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.310393 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.311438 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.312036 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.328553 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config" (OuterVolumeSpecName: "config") pod "943383c2-4d4b-47bd-bb39-1d671795e573" (UID: "943383c2-4d4b-47bd-bb39-1d671795e573"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.413232 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.413280 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.413294 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943383c2-4d4b-47bd-bb39-1d671795e573-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.588618 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" event={"ID":"943383c2-4d4b-47bd-bb39-1d671795e573","Type":"ContainerDied","Data":"dd07bf0a06c42cffdadf80a72d9647a1f53aa0fdf35828807e87ee742a13915e"} Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.588694 4745 scope.go:117] "RemoveContainer" containerID="76271e94b457866083056a11543c6a26ba1888d918c47a9aedefee7c65b6293d" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.588889 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-ln8w2" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.599768 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerStarted","Data":"5fd672e48430bca4d1c2a71daa80ccf27bd2fcb9360ab8501dfd3906ec36e53e"} Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.606670 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerStarted","Data":"7f357a77e3f90463d2c7d6a9332c000cdf394024ffee885639dea61cbd29f951"} Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.630980 4745 scope.go:117] "RemoveContainer" containerID="bb8d2b4217a393f956597ec47aee5cf1a2a5a42823f6874f39aab5832c31fef0" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.647726 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.6476750730000003 podStartE2EDuration="3.647675073s" podCreationTimestamp="2025-12-09 11:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:44.628123516 +0000 UTC m=+1491.453325040" watchObservedRunningTime="2025-12-09 11:56:44.647675073 +0000 UTC m=+1491.472876617" Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.677573 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:56:44 crc kubenswrapper[4745]: I1209 11:56:44.687211 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-ln8w2"] Dec 09 11:56:45 crc kubenswrapper[4745]: I1209 11:56:45.568421 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" path="/var/lib/kubelet/pods/943383c2-4d4b-47bd-bb39-1d671795e573/volumes" Dec 09 11:56:45 
crc kubenswrapper[4745]: I1209 11:56:45.623018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerStarted","Data":"c29448a85553503fc1fcb48e56752ab6bca8a6dd9078d1c9f160ca8db10b194d"} Dec 09 11:56:47 crc kubenswrapper[4745]: I1209 11:56:47.647310 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerStarted","Data":"2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831"} Dec 09 11:56:49 crc kubenswrapper[4745]: I1209 11:56:49.684881 4745 generic.go:334] "Generic (PLEG): container finished" podID="26c851b2-74b9-43d9-82be-3ee896408d78" containerID="d2d724612e33be6891fca27b52fe886988ad84a0f8f9f933dfaa7cdd089d1f93" exitCode=0 Dec 09 11:56:49 crc kubenswrapper[4745]: I1209 11:56:49.685072 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdw4" event={"ID":"26c851b2-74b9-43d9-82be-3ee896408d78","Type":"ContainerDied","Data":"d2d724612e33be6891fca27b52fe886988ad84a0f8f9f933dfaa7cdd089d1f93"} Dec 09 11:56:49 crc kubenswrapper[4745]: I1209 11:56:49.691629 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerStarted","Data":"29da6f720b0b04eb76a45efbd214f1c1f0273bc06fc58a52d1909cd993ab7077"} Dec 09 11:56:49 crc kubenswrapper[4745]: I1209 11:56:49.691794 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 11:56:49 crc kubenswrapper[4745]: I1209 11:56:49.747179 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.543600033 podStartE2EDuration="8.747158461s" podCreationTimestamp="2025-12-09 11:56:41 +0000 UTC" firstStartedPulling="2025-12-09 11:56:42.728368322 +0000 UTC m=+1489.553569846" 
lastFinishedPulling="2025-12-09 11:56:48.93192676 +0000 UTC m=+1495.757128274" observedRunningTime="2025-12-09 11:56:49.743234115 +0000 UTC m=+1496.568435639" watchObservedRunningTime="2025-12-09 11:56:49.747158461 +0000 UTC m=+1496.572359985" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.164164 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.239429 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pf9m\" (UniqueName: \"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m\") pod \"26c851b2-74b9-43d9-82be-3ee896408d78\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.239496 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data\") pod \"26c851b2-74b9-43d9-82be-3ee896408d78\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.239568 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts\") pod \"26c851b2-74b9-43d9-82be-3ee896408d78\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.239740 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle\") pod \"26c851b2-74b9-43d9-82be-3ee896408d78\" (UID: \"26c851b2-74b9-43d9-82be-3ee896408d78\") " Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.248890 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m" (OuterVolumeSpecName: "kube-api-access-4pf9m") pod "26c851b2-74b9-43d9-82be-3ee896408d78" (UID: "26c851b2-74b9-43d9-82be-3ee896408d78"). InnerVolumeSpecName "kube-api-access-4pf9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.254840 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts" (OuterVolumeSpecName: "scripts") pod "26c851b2-74b9-43d9-82be-3ee896408d78" (UID: "26c851b2-74b9-43d9-82be-3ee896408d78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.276066 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data" (OuterVolumeSpecName: "config-data") pod "26c851b2-74b9-43d9-82be-3ee896408d78" (UID: "26c851b2-74b9-43d9-82be-3ee896408d78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.289793 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26c851b2-74b9-43d9-82be-3ee896408d78" (UID: "26c851b2-74b9-43d9-82be-3ee896408d78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.342334 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.342370 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pf9m\" (UniqueName: \"kubernetes.io/projected/26c851b2-74b9-43d9-82be-3ee896408d78-kube-api-access-4pf9m\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.342383 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.342390 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c851b2-74b9-43d9-82be-3ee896408d78-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.715444 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdw4" event={"ID":"26c851b2-74b9-43d9-82be-3ee896408d78","Type":"ContainerDied","Data":"5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87"} Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.715497 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdw4" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.715545 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae5ccbbf525530f15fed77304e97da0ae9fd09f8c1d9dae51161a66007b5e87" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.924586 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.924962 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerName="nova-scheduler-scheduler" containerID="cri-o://e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" gracePeriod=30 Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.939103 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.939202 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 11:56:51 crc kubenswrapper[4745]: I1209 11:56:51.941278 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.093862 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.094458 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" containerID="cri-o://9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6" gracePeriod=30 Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.094571 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" 
containerName="nova-metadata-metadata" containerID="cri-o://11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3" gracePeriod=30 Dec 09 11:56:52 crc kubenswrapper[4745]: E1209 11:56:52.308637 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:56:52 crc kubenswrapper[4745]: E1209 11:56:52.310709 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:56:52 crc kubenswrapper[4745]: E1209 11:56:52.312084 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:56:52 crc kubenswrapper[4745]: E1209 11:56:52.312149 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerName="nova-scheduler-scheduler" Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.735787 4745 generic.go:334] "Generic (PLEG): container finished" podID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerID="9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6" exitCode=143 Dec 09 11:56:52 crc kubenswrapper[4745]: 
I1209 11:56:52.735847 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerDied","Data":"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6"} Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.736033 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-log" containerID="cri-o://b3582d79abf0111444fa1e04343e911247270c4e59ad138b4d72b67ddc80ca87" gracePeriod=30 Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.736113 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-api" containerID="cri-o://7f357a77e3f90463d2c7d6a9332c000cdf394024ffee885639dea61cbd29f951" gracePeriod=30 Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.742838 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": EOF" Dec 09 11:56:52 crc kubenswrapper[4745]: I1209 11:56:52.743047 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": EOF" Dec 09 11:56:53 crc kubenswrapper[4745]: I1209 11:56:53.750840 4745 generic.go:334] "Generic (PLEG): container finished" podID="e32e826d-c08c-4308-9015-696ed1413663" containerID="b3582d79abf0111444fa1e04343e911247270c4e59ad138b4d72b67ddc80ca87" exitCode=143 Dec 09 11:56:53 crc kubenswrapper[4745]: I1209 11:56:53.750978 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerDied","Data":"b3582d79abf0111444fa1e04343e911247270c4e59ad138b4d72b67ddc80ca87"} Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.227662 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47528->10.217.0.191:8775: read: connection reset by peer" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.227690 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47536->10.217.0.191:8775: read: connection reset by peer" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.475149 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.475210 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.727641 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.776604 4745 generic.go:334] "Generic (PLEG): container finished" podID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerID="11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3" exitCode=0 Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.776656 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerDied","Data":"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3"} Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.776689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dadcda89-9dcd-479b-a3d2-ef0422f992c8","Type":"ContainerDied","Data":"a8c82ae0674b271440193e8c7a827faf631ac270ac83e8d9c17cf755747b1a23"} Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.776708 4745 scope.go:117] "RemoveContainer" containerID="11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.776860 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.805312 4745 scope.go:117] "RemoveContainer" containerID="9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.825081 4745 scope.go:117] "RemoveContainer" containerID="11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3" Dec 09 11:56:55 crc kubenswrapper[4745]: E1209 11:56:55.825819 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3\": container with ID starting with 11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3 not found: ID does not exist" containerID="11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.825896 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3"} err="failed to get container status \"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3\": rpc error: code = NotFound desc = could not find container \"11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3\": container with ID starting with 11df3c618a85a65ff7e238affe01b3c177f9c30ad986c9484fb0ea61308590c3 not found: ID does not exist" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.825939 4745 scope.go:117] "RemoveContainer" containerID="9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6" Dec 09 11:56:55 crc kubenswrapper[4745]: E1209 11:56:55.826348 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6\": container with ID starting with 
9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6 not found: ID does not exist" containerID="9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.826374 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6"} err="failed to get container status \"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6\": rpc error: code = NotFound desc = could not find container \"9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6\": container with ID starting with 9416585f06086a9bb4c7aa5fd6561a97084403b00568331dbddc6bc7ed178be6 not found: ID does not exist" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.861168 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs\") pod \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.861399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcjfp\" (UniqueName: \"kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp\") pod \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.861668 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle\") pod \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.862454 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs\") pod \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.862908 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs" (OuterVolumeSpecName: "logs") pod "dadcda89-9dcd-479b-a3d2-ef0422f992c8" (UID: "dadcda89-9dcd-479b-a3d2-ef0422f992c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.863402 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data\") pod \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\" (UID: \"dadcda89-9dcd-479b-a3d2-ef0422f992c8\") " Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.865413 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dadcda89-9dcd-479b-a3d2-ef0422f992c8-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.871248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp" (OuterVolumeSpecName: "kube-api-access-gcjfp") pod "dadcda89-9dcd-479b-a3d2-ef0422f992c8" (UID: "dadcda89-9dcd-479b-a3d2-ef0422f992c8"). InnerVolumeSpecName "kube-api-access-gcjfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.900592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data" (OuterVolumeSpecName: "config-data") pod "dadcda89-9dcd-479b-a3d2-ef0422f992c8" (UID: "dadcda89-9dcd-479b-a3d2-ef0422f992c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.913766 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dadcda89-9dcd-479b-a3d2-ef0422f992c8" (UID: "dadcda89-9dcd-479b-a3d2-ef0422f992c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.943680 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dadcda89-9dcd-479b-a3d2-ef0422f992c8" (UID: "dadcda89-9dcd-479b-a3d2-ef0422f992c8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.968158 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.968445 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.968524 4745 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dadcda89-9dcd-479b-a3d2-ef0422f992c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:55 crc kubenswrapper[4745]: I1209 11:56:55.968588 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcjfp\" (UniqueName: \"kubernetes.io/projected/dadcda89-9dcd-479b-a3d2-ef0422f992c8-kube-api-access-gcjfp\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.121634 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.132701 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.163682 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:56 crc kubenswrapper[4745]: E1209 11:56:56.164297 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="init" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164319 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="init" Dec 09 11:56:56 crc 
kubenswrapper[4745]: E1209 11:56:56.164331 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="dnsmasq-dns" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164339 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="dnsmasq-dns" Dec 09 11:56:56 crc kubenswrapper[4745]: E1209 11:56:56.164354 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c851b2-74b9-43d9-82be-3ee896408d78" containerName="nova-manage" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164362 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c851b2-74b9-43d9-82be-3ee896408d78" containerName="nova-manage" Dec 09 11:56:56 crc kubenswrapper[4745]: E1209 11:56:56.164392 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-metadata" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164399 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-metadata" Dec 09 11:56:56 crc kubenswrapper[4745]: E1209 11:56:56.164406 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164413 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164619 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-log" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164635 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c851b2-74b9-43d9-82be-3ee896408d78" containerName="nova-manage" Dec 09 11:56:56 crc 
kubenswrapper[4745]: I1209 11:56:56.164651 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" containerName="nova-metadata-metadata" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.164661 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="943383c2-4d4b-47bd-bb39-1d671795e573" containerName="dnsmasq-dns" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.165947 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.170315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.174107 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.185701 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.277580 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszll\" (UniqueName: \"kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.277703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.277790 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.277815 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.277903 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.380580 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.380725 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszll\" (UniqueName: \"kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.380792 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.380832 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.380855 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.382679 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.386554 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.387056 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.393592 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.404765 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dszll\" (UniqueName: \"kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll\") pod \"nova-metadata-0\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.500524 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.794226 4745 generic.go:334] "Generic (PLEG): container finished" podID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerID="e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" exitCode=0 Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.794619 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65bb33f9-ecd8-4960-8cc3-e9537509e71f","Type":"ContainerDied","Data":"e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31"} Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.951220 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.992554 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle\") pod \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.992665 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data\") pod \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.992690 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-827g8\" (UniqueName: \"kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8\") pod \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\" (UID: \"65bb33f9-ecd8-4960-8cc3-e9537509e71f\") " Dec 09 11:56:56 crc kubenswrapper[4745]: I1209 11:56:56.998593 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8" (OuterVolumeSpecName: "kube-api-access-827g8") pod "65bb33f9-ecd8-4960-8cc3-e9537509e71f" (UID: "65bb33f9-ecd8-4960-8cc3-e9537509e71f"). InnerVolumeSpecName "kube-api-access-827g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.028661 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data" (OuterVolumeSpecName: "config-data") pod "65bb33f9-ecd8-4960-8cc3-e9537509e71f" (UID: "65bb33f9-ecd8-4960-8cc3-e9537509e71f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.030206 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65bb33f9-ecd8-4960-8cc3-e9537509e71f" (UID: "65bb33f9-ecd8-4960-8cc3-e9537509e71f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.073397 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.096723 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.096789 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb33f9-ecd8-4960-8cc3-e9537509e71f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.096803 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-827g8\" (UniqueName: \"kubernetes.io/projected/65bb33f9-ecd8-4960-8cc3-e9537509e71f-kube-api-access-827g8\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.569631 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadcda89-9dcd-479b-a3d2-ef0422f992c8" path="/var/lib/kubelet/pods/dadcda89-9dcd-479b-a3d2-ef0422f992c8/volumes" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.805584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerStarted","Data":"2b371925777f6434276ab7a74c14a8e465cf580329fd7735e425f6016bedfb3b"} Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.805652 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerStarted","Data":"54a2edbbc1397cc5ce4a40d2a57015e139f31d6f9f4b30325e3a940d377baabd"} Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.805668 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerStarted","Data":"de2677d35a9062cecaef36be6523796776d3310b2806e5606298434ebb2fe652"} Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.808488 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65bb33f9-ecd8-4960-8cc3-e9537509e71f","Type":"ContainerDied","Data":"4417887b62a16c9e928d38063f358e137ba25d2c6c20b41dadfb02eff67c05a8"} Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.808565 4745 scope.go:117] "RemoveContainer" containerID="e440aff32fcaec3a8114366fee841731f1fbf5657009fb4945aed3566c407e31" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.808718 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.842471 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8424508450000001 podStartE2EDuration="1.842450845s" podCreationTimestamp="2025-12-09 11:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:57.833426802 +0000 UTC m=+1504.658628326" watchObservedRunningTime="2025-12-09 11:56:57.842450845 +0000 UTC m=+1504.667652369" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.864876 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.872799 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.888110 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:57 crc kubenswrapper[4745]: E1209 11:56:57.888792 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerName="nova-scheduler-scheduler" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.888818 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerName="nova-scheduler-scheduler" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.889117 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" containerName="nova-scheduler-scheduler" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.890210 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.894524 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.908495 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.928459 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.928631 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm98c\" (UniqueName: \"kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:57 crc kubenswrapper[4745]: I1209 11:56:57.928746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.031479 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm98c\" (UniqueName: \"kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.031662 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.031876 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.038643 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.046296 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.051897 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm98c\" (UniqueName: \"kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c\") pod \"nova-scheduler-0\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.224297 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.427619 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ftdfr"] Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.432243 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.443419 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftdfr"] Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.545754 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.545826 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.545961 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxz2\" (UniqueName: \"kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.650752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5lxz2\" (UniqueName: \"kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.652027 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.652107 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.652977 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.657225 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.680937 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxz2\" (UniqueName: 
\"kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2\") pod \"community-operators-ftdfr\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") " pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.776259 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftdfr" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.784157 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:56:58 crc kubenswrapper[4745]: W1209 11:56:58.784876 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392b878a_37ba_4887_a699_672c8b92e947.slice/crio-7adfd565aa074e4b03e26470a07850ac0acc8fcf3b826ab83943ab8c20b30f75 WatchSource:0}: Error finding container 7adfd565aa074e4b03e26470a07850ac0acc8fcf3b826ab83943ab8c20b30f75: Status 404 returned error can't find the container with id 7adfd565aa074e4b03e26470a07850ac0acc8fcf3b826ab83943ab8c20b30f75 Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.831676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392b878a-37ba-4887-a699-672c8b92e947","Type":"ContainerStarted","Data":"7adfd565aa074e4b03e26470a07850ac0acc8fcf3b826ab83943ab8c20b30f75"} Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.850104 4745 generic.go:334] "Generic (PLEG): container finished" podID="e32e826d-c08c-4308-9015-696ed1413663" containerID="7f357a77e3f90463d2c7d6a9332c000cdf394024ffee885639dea61cbd29f951" exitCode=0 Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.851019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerDied","Data":"7f357a77e3f90463d2c7d6a9332c000cdf394024ffee885639dea61cbd29f951"} Dec 09 11:56:58 crc 
kubenswrapper[4745]: I1209 11:56:58.943102 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.959298 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzrsr\" (UniqueName: \"kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.960611 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.960664 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.960691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.960723 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.960839 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs\") pod \"e32e826d-c08c-4308-9015-696ed1413663\" (UID: \"e32e826d-c08c-4308-9015-696ed1413663\") " Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.962397 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs" (OuterVolumeSpecName: "logs") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:56:58 crc kubenswrapper[4745]: I1209 11:56:58.967375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr" (OuterVolumeSpecName: "kube-api-access-pzrsr") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "kube-api-access-pzrsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.084958 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32e826d-c08c-4308-9015-696ed1413663-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.084994 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzrsr\" (UniqueName: \"kubernetes.io/projected/e32e826d-c08c-4308-9015-696ed1413663-kube-api-access-pzrsr\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.112565 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data" (OuterVolumeSpecName: "config-data") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.117779 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.161689 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.180631 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e32e826d-c08c-4308-9015-696ed1413663" (UID: "e32e826d-c08c-4308-9015-696ed1413663"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.187043 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.187436 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.187452 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.187461 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32e826d-c08c-4308-9015-696ed1413663-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.433171 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftdfr"] Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.570984 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bb33f9-ecd8-4960-8cc3-e9537509e71f" path="/var/lib/kubelet/pods/65bb33f9-ecd8-4960-8cc3-e9537509e71f/volumes" Dec 09 
11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.862856 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392b878a-37ba-4887-a699-672c8b92e947","Type":"ContainerStarted","Data":"4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1"} Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.867591 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e32e826d-c08c-4308-9015-696ed1413663","Type":"ContainerDied","Data":"f16800d04724f0e952322eec9f84212248723163a2e2c73c044ddae852e4d52b"} Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.867695 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.867862 4745 scope.go:117] "RemoveContainer" containerID="7f357a77e3f90463d2c7d6a9332c000cdf394024ffee885639dea61cbd29f951" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.872346 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerID="da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77" exitCode=0 Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.872394 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerDied","Data":"da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77"} Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.872424 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerStarted","Data":"ed8baf5961d1c1768ae104a1604b08c80d77bf5c93c1b70c27eacc559d15bcb7"} Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.892442 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.892418339 podStartE2EDuration="2.892418339s" podCreationTimestamp="2025-12-09 11:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:56:59.881682789 +0000 UTC m=+1506.706884333" watchObservedRunningTime="2025-12-09 11:56:59.892418339 +0000 UTC m=+1506.717619863" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.909868 4745 scope.go:117] "RemoveContainer" containerID="b3582d79abf0111444fa1e04343e911247270c4e59ad138b4d72b67ddc80ca87" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.949233 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.958986 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.972660 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 11:56:59 crc kubenswrapper[4745]: E1209 11:56:59.973350 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-log" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.973388 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-log" Dec 09 11:56:59 crc kubenswrapper[4745]: E1209 11:56:59.973422 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-api" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.973436 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-api" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.976566 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32e826d-c08c-4308-9015-696ed1413663" 
containerName="nova-api-log" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.976648 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32e826d-c08c-4308-9015-696ed1413663" containerName="nova-api-api" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.979254 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.982609 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.982681 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 09 11:56:59 crc kubenswrapper[4745]: I1209 11:56:59.982560 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004233 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004327 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " 
pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52wnm\" (UniqueName: \"kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004455 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.004555 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.009269 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.107378 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.107524 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc 
kubenswrapper[4745]: I1209 11:57:00.107567 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.107587 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.107669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.107704 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52wnm\" (UniqueName: \"kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.108486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.113899 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.120374 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.120692 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.127394 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.128187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52wnm\" (UniqueName: \"kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm\") pod \"nova-api-0\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.316358 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.826045 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.891440 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerStarted","Data":"7fc3a9b03537f8a84a8d51c00b0cac0413c631e73c2ee7cb3b249c44da01c4e7"} Dec 09 11:57:00 crc kubenswrapper[4745]: I1209 11:57:00.893826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerStarted","Data":"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"} Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.501455 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.502058 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.570244 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32e826d-c08c-4308-9015-696ed1413663" path="/var/lib/kubelet/pods/e32e826d-c08c-4308-9015-696ed1413663/volumes" Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.927610 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerID="b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e" exitCode=0 Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.927726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerDied","Data":"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"} Dec 09 11:57:01 crc 
kubenswrapper[4745]: I1209 11:57:01.934920 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerStarted","Data":"4d0d0764d9f0fe5167222401d27b58294f16f2e351cb70a946140ba51e127793"}
Dec 09 11:57:01 crc kubenswrapper[4745]: I1209 11:57:01.935331 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerStarted","Data":"0959a307a96a3c8b986b8df7cc54318c82c7451c29bafc311bb4eafa178c8154"}
Dec 09 11:57:02 crc kubenswrapper[4745]: I1209 11:57:02.031299 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.031271579 podStartE2EDuration="3.031271579s" podCreationTimestamp="2025-12-09 11:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:57:02.01761065 +0000 UTC m=+1508.842812174" watchObservedRunningTime="2025-12-09 11:57:02.031271579 +0000 UTC m=+1508.856473103"
Dec 09 11:57:02 crc kubenswrapper[4745]: I1209 11:57:02.949954 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerStarted","Data":"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"}
Dec 09 11:57:02 crc kubenswrapper[4745]: I1209 11:57:02.977858 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ftdfr" podStartSLOduration=2.527804542 podStartE2EDuration="4.977835892s" podCreationTimestamp="2025-12-09 11:56:58 +0000 UTC" firstStartedPulling="2025-12-09 11:56:59.874544547 +0000 UTC m=+1506.699746071" lastFinishedPulling="2025-12-09 11:57:02.324575897 +0000 UTC m=+1509.149777421" observedRunningTime="2025-12-09 11:57:02.968909691 +0000 UTC m=+1509.794111235" watchObservedRunningTime="2025-12-09 11:57:02.977835892 +0000 UTC m=+1509.803037416"
Dec 09 11:57:03 crc kubenswrapper[4745]: I1209 11:57:03.225201 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 09 11:57:06 crc kubenswrapper[4745]: I1209 11:57:06.501331 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 09 11:57:06 crc kubenswrapper[4745]: I1209 11:57:06.501988 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 09 11:57:07 crc kubenswrapper[4745]: I1209 11:57:07.552663 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:07 crc kubenswrapper[4745]: I1209 11:57:07.552738 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:08 crc kubenswrapper[4745]: I1209 11:57:08.225612 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 09 11:57:08 crc kubenswrapper[4745]: I1209 11:57:08.260584 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 09 11:57:08 crc kubenswrapper[4745]: I1209 11:57:08.777626 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:08 crc kubenswrapper[4745]: I1209 11:57:08.778153 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:08 crc kubenswrapper[4745]: I1209 11:57:08.841485 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:09 crc kubenswrapper[4745]: I1209 11:57:09.055888 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 09 11:57:09 crc kubenswrapper[4745]: I1209 11:57:09.079172 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:09 crc kubenswrapper[4745]: I1209 11:57:09.167050 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftdfr"]
Dec 09 11:57:10 crc kubenswrapper[4745]: I1209 11:57:10.316842 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 09 11:57:10 crc kubenswrapper[4745]: I1209 11:57:10.317424 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.053036 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ftdfr" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="registry-server" containerID="cri-o://43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f" gracePeriod=2
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.337743 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.337766 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.560222 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.622370 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities\") pod \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") "
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.622607 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxz2\" (UniqueName: \"kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2\") pod \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") "
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.622717 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content\") pod \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\" (UID: \"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043\") "
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.623561 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities" (OuterVolumeSpecName: "utilities") pod "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" (UID: "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.644828 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2" (OuterVolumeSpecName: "kube-api-access-5lxz2") pod "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" (UID: "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043"). InnerVolumeSpecName "kube-api-access-5lxz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.675436 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" (UID: "ad6a4ce9-ca82-4c63-bafe-706a3b2b5043"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.726344 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.726375 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxz2\" (UniqueName: \"kubernetes.io/projected/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-kube-api-access-5lxz2\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:11 crc kubenswrapper[4745]: I1209 11:57:11.726409 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.036809 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.100489 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerID="43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f" exitCode=0
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.100584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerDied","Data":"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"}
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.100623 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftdfr" event={"ID":"ad6a4ce9-ca82-4c63-bafe-706a3b2b5043","Type":"ContainerDied","Data":"ed8baf5961d1c1768ae104a1604b08c80d77bf5c93c1b70c27eacc559d15bcb7"}
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.100644 4745 scope.go:117] "RemoveContainer" containerID="43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.100851 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftdfr"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.153867 4745 scope.go:117] "RemoveContainer" containerID="b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.204131 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftdfr"]
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.217573 4745 scope.go:117] "RemoveContainer" containerID="da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.228630 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ftdfr"]
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.267460 4745 scope.go:117] "RemoveContainer" containerID="43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"
Dec 09 11:57:12 crc kubenswrapper[4745]: E1209 11:57:12.268086 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f\": container with ID starting with 43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f not found: ID does not exist" containerID="43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.268127 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f"} err="failed to get container status \"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f\": rpc error: code = NotFound desc = could not find container \"43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f\": container with ID starting with 43f8fa5c7d52efe878ea1c003b0e7f45b3f6a72fc8721a17a2cdf61c978b071f not found: ID does not exist"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.268153 4745 scope.go:117] "RemoveContainer" containerID="b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"
Dec 09 11:57:12 crc kubenswrapper[4745]: E1209 11:57:12.268360 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e\": container with ID starting with b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e not found: ID does not exist" containerID="b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.268383 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e"} err="failed to get container status \"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e\": rpc error: code = NotFound desc = could not find container \"b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e\": container with ID starting with b8014319e1b426576aa5826872a2e5d3f6bdc55954e6ebee617c7290df73a53e not found: ID does not exist"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.268399 4745 scope.go:117] "RemoveContainer" containerID="da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77"
Dec 09 11:57:12 crc kubenswrapper[4745]: E1209 11:57:12.268723 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77\": container with ID starting with da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77 not found: ID does not exist" containerID="da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77"
Dec 09 11:57:12 crc kubenswrapper[4745]: I1209 11:57:12.268747 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77"} err="failed to get container status \"da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77\": rpc error: code = NotFound desc = could not find container \"da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77\": container with ID starting with da5511b3949270908589006e86fc679c4fcea45ad940ca9f4100818479e2db77 not found: ID does not exist"
Dec 09 11:57:13 crc kubenswrapper[4745]: I1209 11:57:13.568407 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" path="/var/lib/kubelet/pods/ad6a4ce9-ca82-4c63-bafe-706a3b2b5043/volumes"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.509054 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:14 crc kubenswrapper[4745]: E1209 11:57:14.510807 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="extract-utilities"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.510829 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="extract-utilities"
Dec 09 11:57:14 crc kubenswrapper[4745]: E1209 11:57:14.510876 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="registry-server"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.510886 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="registry-server"
Dec 09 11:57:14 crc kubenswrapper[4745]: E1209 11:57:14.510974 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="extract-content"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.510993 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="extract-content"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.511553 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6a4ce9-ca82-4c63-bafe-706a3b2b5043" containerName="registry-server"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.514147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.527709 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.603331 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.603433 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.603543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8dj\" (UniqueName: \"kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.706084 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.706160 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.706217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8dj\" (UniqueName: \"kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.706830 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.707747 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.728260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8dj\" (UniqueName: \"kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj\") pod \"certified-operators-29vl7\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") " pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:14 crc kubenswrapper[4745]: I1209 11:57:14.841254 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:15 crc kubenswrapper[4745]: I1209 11:57:15.451461 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:15 crc kubenswrapper[4745]: W1209 11:57:15.477194 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7c604c_8bc1_47b1_8856_07bd5edeb24f.slice/crio-7e343c2262158e8143961fbcde206b246aea63e26ff34a3391795e8354fdd788 WatchSource:0}: Error finding container 7e343c2262158e8143961fbcde206b246aea63e26ff34a3391795e8354fdd788: Status 404 returned error can't find the container with id 7e343c2262158e8143961fbcde206b246aea63e26ff34a3391795e8354fdd788
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.166182 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerID="9cace66fb4eb49f8df9311c715cbda518e9baa118c07812d67999f5df57dd6ca" exitCode=0
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.166365 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerDied","Data":"9cace66fb4eb49f8df9311c715cbda518e9baa118c07812d67999f5df57dd6ca"}
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.166832 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerStarted","Data":"7e343c2262158e8143961fbcde206b246aea63e26ff34a3391795e8354fdd788"}
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.506841 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.508542 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 09 11:57:16 crc kubenswrapper[4745]: I1209 11:57:16.512340 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 09 11:57:17 crc kubenswrapper[4745]: I1209 11:57:17.181434 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerStarted","Data":"2506d9bb2e0fe664301886fb60d4ac28004bab06519bf43bd9ffca0d5cc28590"}
Dec 09 11:57:17 crc kubenswrapper[4745]: I1209 11:57:17.190367 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 09 11:57:18 crc kubenswrapper[4745]: I1209 11:57:18.201416 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerID="2506d9bb2e0fe664301886fb60d4ac28004bab06519bf43bd9ffca0d5cc28590" exitCode=0
Dec 09 11:57:18 crc kubenswrapper[4745]: I1209 11:57:18.201564 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerDied","Data":"2506d9bb2e0fe664301886fb60d4ac28004bab06519bf43bd9ffca0d5cc28590"}
Dec 09 11:57:19 crc kubenswrapper[4745]: I1209 11:57:19.227905 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerStarted","Data":"695550ec39a697e847c5e929393144e34dbd61f35570ff687dbad59e64829fa1"}
Dec 09 11:57:19 crc kubenswrapper[4745]: I1209 11:57:19.262341 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-29vl7" podStartSLOduration=2.825844377 podStartE2EDuration="5.262315241s" podCreationTimestamp="2025-12-09 11:57:14 +0000 UTC" firstStartedPulling="2025-12-09 11:57:16.168976706 +0000 UTC m=+1522.994178230" lastFinishedPulling="2025-12-09 11:57:18.60544757 +0000 UTC m=+1525.430649094" observedRunningTime="2025-12-09 11:57:19.249469985 +0000 UTC m=+1526.074671519" watchObservedRunningTime="2025-12-09 11:57:19.262315241 +0000 UTC m=+1526.087516765"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.325231 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.325963 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.327003 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.327523 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.334203 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 09 11:57:20 crc kubenswrapper[4745]: I1209 11:57:20.337241 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 09 11:57:24 crc kubenswrapper[4745]: I1209 11:57:24.841574 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:24 crc kubenswrapper[4745]: I1209 11:57:24.842814 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:24 crc kubenswrapper[4745]: I1209 11:57:24.920486 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.336783 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.391160 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.475862 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.475938 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.475998 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.477093 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 11:57:25 crc kubenswrapper[4745]: I1209 11:57:25.477166 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" gracePeriod=600
Dec 09 11:57:25 crc kubenswrapper[4745]: E1209 11:57:25.605258 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 11:57:26 crc kubenswrapper[4745]: I1209 11:57:26.301756 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" exitCode=0
Dec 09 11:57:26 crc kubenswrapper[4745]: I1209 11:57:26.301864 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3"}
Dec 09 11:57:26 crc kubenswrapper[4745]: I1209 11:57:26.302356 4745 scope.go:117] "RemoveContainer" containerID="254488b12cfc6e65d01192f108f9d8847d5257e1f8c39a968b3046b52ec176b8"
Dec 09 11:57:26 crc kubenswrapper[4745]: I1209 11:57:26.303901 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3"
Dec 09 11:57:26 crc kubenswrapper[4745]: E1209 11:57:26.305163 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 11:57:27 crc kubenswrapper[4745]: I1209 11:57:27.316871 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-29vl7" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="registry-server" containerID="cri-o://695550ec39a697e847c5e929393144e34dbd61f35570ff687dbad59e64829fa1" gracePeriod=2
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.332133 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerID="695550ec39a697e847c5e929393144e34dbd61f35570ff687dbad59e64829fa1" exitCode=0
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.332200 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerDied","Data":"695550ec39a697e847c5e929393144e34dbd61f35570ff687dbad59e64829fa1"}
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.924318 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.979372 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities\") pod \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") "
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.979673 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content\") pod \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") "
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.979775 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8dj\" (UniqueName: \"kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj\") pod \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\" (UID: \"ad7c604c-8bc1-47b1-8856-07bd5edeb24f\") "
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.980650 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities" (OuterVolumeSpecName: "utilities") pod "ad7c604c-8bc1-47b1-8856-07bd5edeb24f" (UID: "ad7c604c-8bc1-47b1-8856-07bd5edeb24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:57:28 crc kubenswrapper[4745]: I1209 11:57:28.988860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj" (OuterVolumeSpecName: "kube-api-access-5p8dj") pod "ad7c604c-8bc1-47b1-8856-07bd5edeb24f" (UID: "ad7c604c-8bc1-47b1-8856-07bd5edeb24f"). InnerVolumeSpecName "kube-api-access-5p8dj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.039491 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad7c604c-8bc1-47b1-8856-07bd5edeb24f" (UID: "ad7c604c-8bc1-47b1-8856-07bd5edeb24f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.083460 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.083551 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p8dj\" (UniqueName: \"kubernetes.io/projected/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-kube-api-access-5p8dj\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.083571 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7c604c-8bc1-47b1-8856-07bd5edeb24f-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.349536 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29vl7" event={"ID":"ad7c604c-8bc1-47b1-8856-07bd5edeb24f","Type":"ContainerDied","Data":"7e343c2262158e8143961fbcde206b246aea63e26ff34a3391795e8354fdd788"}
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.349624 4745 scope.go:117] "RemoveContainer" containerID="695550ec39a697e847c5e929393144e34dbd61f35570ff687dbad59e64829fa1"
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.349620 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29vl7"
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.375527 4745 scope.go:117] "RemoveContainer" containerID="2506d9bb2e0fe664301886fb60d4ac28004bab06519bf43bd9ffca0d5cc28590"
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.399171 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.403882 4745 scope.go:117] "RemoveContainer" containerID="9cace66fb4eb49f8df9311c715cbda518e9baa118c07812d67999f5df57dd6ca"
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.411274 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-29vl7"]
Dec 09 11:57:29 crc kubenswrapper[4745]: I1209 11:57:29.567443 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" path="/var/lib/kubelet/pods/ad7c604c-8bc1-47b1-8856-07bd5edeb24f/volumes"
Dec 09 11:57:40 crc kubenswrapper[4745]: I1209 11:57:40.555680 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3"
Dec 09 11:57:40 crc kubenswrapper[4745]: E1209 11:57:40.556908 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.637036 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.637637 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="ac238b52-4167-4847-b66f-6985b784268c" containerName="openstackclient" containerID="cri-o://dc6f10acb25d86858fe8f0ee4fbcf99c84e675bea1bcdd1d53f89781a1482c42" gracePeriod=2
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.691685 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.745640 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.746442 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="openstack-network-exporter" containerID="cri-o://82e76a7787b8495afefaced3bf46a59645fcc2228c694a5e7e6a738e92d9f044" gracePeriod=300
Dec 09 11:57:42 crc kubenswrapper[4745]: I1209 11:57:42.808925 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 09 11:57:42 crc kubenswrapper[4745]: E1209 11:57:42.932372 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 09 11:57:42 crc kubenswrapper[4745]: E1209 11:57:42.932478 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data podName:ceba626e-26d1-495f-b88d-fed69e445ddb nodeName:}" failed. No retries permitted until 2025-12-09 11:57:43.432454033 +0000 UTC m=+1550.257655737 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data") pod "rabbitmq-cell1-server-0" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb") : configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:42.994915 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.006016 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" containerID="cri-o://f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" gracePeriod=30 Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.006679 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="openstack-network-exporter" containerID="cri-o://ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84" gracePeriod=30 Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.023466 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron1fc4-account-delete-rqj56"] Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.028008 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="extract-utilities" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028048 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="extract-utilities" Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.028068 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="extract-content" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028076 4745 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="extract-content" Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.028117 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac238b52-4167-4847-b66f-6985b784268c" containerName="openstackclient" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028123 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac238b52-4167-4847-b66f-6985b784268c" containerName="openstackclient" Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.028132 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="registry-server" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028138 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="registry-server" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028399 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7c604c-8bc1-47b1-8856-07bd5edeb24f" containerName="registry-server" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.028425 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac238b52-4167-4847-b66f-6985b784268c" containerName="openstackclient" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.029391 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.031476 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74lk\" (UniqueName: \"kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.031642 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.126968 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement980e-account-delete-85kkl"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.128879 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.134934 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.135056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74lk\" (UniqueName: \"kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.135950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.164938 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.200205 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement980e-account-delete-85kkl"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.237825 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74lk\" (UniqueName: \"kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk\") pod \"neutron1fc4-account-delete-rqj56\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " 
pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.264312 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxpjs\" (UniqueName: \"kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.264921 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.315655 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron1fc4-account-delete-rqj56"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.340427 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="ovsdbserver-nb" containerID="cri-o://bff23da6e6763af0c2998317017091c2a8ad2169bcc47bd632c82d8992462e86" gracePeriod=300 Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.340724 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.368556 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.369719 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.369833 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpjs\" (UniqueName: \"kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.370949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.371821 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.371892 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data podName:5b860955-30eb-40e6-bd56-caf6098aed8a nodeName:}" failed. No retries permitted until 2025-12-09 11:57:43.87186952 +0000 UTC m=+1550.697071044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data") pod "rabbitmq-server-0" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a") : configmap "rabbitmq-config-data" not found Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.430905 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.472728 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:43 crc kubenswrapper[4745]: E1209 11:57:43.472820 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data podName:ceba626e-26d1-495f-b88d-fed69e445ddb nodeName:}" failed. No retries permitted until 2025-12-09 11:57:44.472794382 +0000 UTC m=+1551.297995906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data") pod "rabbitmq-cell1-server-0" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb") : configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.488474 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxpjs\" (UniqueName: \"kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs\") pod \"placement980e-account-delete-85kkl\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.503128 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder686d-account-delete-s68kh"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.504880 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.514651 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.573906 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.573968 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qx5\" (UniqueName: \"kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.604458 4745 generic.go:334] "Generic (PLEG): container finished" podID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerID="ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84" exitCode=2 Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.695610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.700123 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6qx5\" (UniqueName: 
\"kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.699053 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.767324 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerDied","Data":"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"} Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.767392 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.767714 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-dw6lx" podUID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" containerName="openstack-network-exporter" containerID="cri-o://6c0e76c55419e28835b5addc2baf871683786704476377eadb9a576f9a3ff72e" gracePeriod=30 Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.926258 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6qx5\" (UniqueName: \"kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5\") pod \"cinder686d-account-delete-s68kh\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.949693 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder686d-account-delete-s68kh"] Dec 09 11:57:43 crc kubenswrapper[4745]: I1209 11:57:43.994206 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mj74p"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.043073 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mj74p"] Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.061108 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.061205 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data podName:5b860955-30eb-40e6-bd56-caf6098aed8a nodeName:}" failed. No retries permitted until 2025-12-09 11:57:45.061180636 +0000 UTC m=+1551.886382160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data") pod "rabbitmq-server-0" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a") : configmap "rabbitmq-config-data" not found Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.088625 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9zn5l"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.187595 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-n7q5f"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.200916 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.220490 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-n7q5f"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.238443 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9zn5l"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.252624 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance4957-account-delete-b6ngs"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.255039 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.286240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.286307 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.291977 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4957-account-delete-b6ngs"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.376066 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pw7nc"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.425985 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.426044 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.426612 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pw7nc"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.427733 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.440955 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.441336 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="dnsmasq-dns" containerID="cri-o://543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2" gracePeriod=10 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.456398 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jmljq"] Dec 09 11:57:44 crc 
kubenswrapper[4745]: E1209 11:57:44.463550 4745 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-dqr9b" message=< Dec 09 11:57:44 crc kubenswrapper[4745]: Exiting ovn-controller (1) [ OK ] Dec 09 11:57:44 crc kubenswrapper[4745]: > Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.464103 4745 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-dqr9b" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" containerID="cri-o://fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.464147 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-dqr9b" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" containerID="cri-o://fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949" gracePeriod=29 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.473980 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jmljq"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.479974 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp\") pod \"glance4957-account-delete-b6ngs\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.490682 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican83f7-account-delete-5vbd2"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.509196 
4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.509540 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.509712 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="openstack-network-exporter" containerID="cri-o://f5ff7ad1034757488e9fa6ef3e4534c7b33ad6d4824c61d216c9946583e12257" gracePeriod=300 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.533906 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican83f7-account-delete-5vbd2"] Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.536792 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.536857 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data podName:ceba626e-26d1-495f-b88d-fed69e445ddb nodeName:}" failed. No retries permitted until 2025-12-09 11:57:46.536840082 +0000 UTC m=+1553.362041606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data") pod "rabbitmq-cell1-server-0" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb") : configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.541961 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.542372 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b68fbfd5-bx5sx" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-api" containerID="cri-o://366b1b96d54298d8ed7a757e883bb67d2c254c7379c342add2a12b68039fea8b" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.544690 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b68fbfd5-bx5sx" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-httpd" containerID="cri-o://3c97425c86c2cd129c5f56303b9b231be4792489fd433e4c00bc829ef51d297c" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.594148 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n77pg"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.634273 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.634680 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7db6b497c6-z9vtr" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-log" containerID="cri-o://c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.635390 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-7db6b497c6-z9vtr" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-api" containerID="cri-o://7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.637324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.642194 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bwb6\" (UniqueName: \"kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.642308 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.695043 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-n77pg"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.741614 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1d49f-account-delete-kt6sb"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.751956 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bwb6\" (UniqueName: \"kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " 
pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.752037 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.756976 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.769407 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.778063 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d83c2d24-3c0c-4097-afd8-1649e08665e4/ovsdbserver-nb/0.log" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.778125 4745 generic.go:334] "Generic (PLEG): container finished" podID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerID="82e76a7787b8495afefaced3bf46a59645fcc2228c694a5e7e6a738e92d9f044" exitCode=2 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.778151 4745 generic.go:334] "Generic (PLEG): container finished" podID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerID="bff23da6e6763af0c2998317017091c2a8ad2169bcc47bd632c82d8992462e86" exitCode=143 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.778203 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerDied","Data":"82e76a7787b8495afefaced3bf46a59645fcc2228c694a5e7e6a738e92d9f044"} Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.778238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerDied","Data":"bff23da6e6763af0c2998317017091c2a8ad2169bcc47bd632c82d8992462e86"} Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.812927 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdw4"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.826032 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bwb6\" (UniqueName: \"kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6\") pod \"barbican83f7-account-delete-5vbd2\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.842313 4745 generic.go:334] "Generic (PLEG): container finished" podID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerID="fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949" exitCode=0 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.842409 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b" event={"ID":"dcea58c8-1e21-4581-bcdb-7b1f88e8b463","Type":"ContainerDied","Data":"fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949"} Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.856339 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " 
pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.868801 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.887725 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-r7bpq"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.890859 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="ovsdbserver-sb" containerID="cri-o://40aa5562bd3f9aae5ac913d05ca874ce263e3c959881973ff50135ec376a86f0" gracePeriod=300 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.891300 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dw6lx_52964bb1-2d93-4df7-afbc-95f1eb10b8fc/openstack-network-exporter/0.log" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.891339 4745 generic.go:334] "Generic (PLEG): container finished" podID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" containerID="6c0e76c55419e28835b5addc2baf871683786704476377eadb9a576f9a3ff72e" exitCode=2 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.891362 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dw6lx" event={"ID":"52964bb1-2d93-4df7-afbc-95f1eb10b8fc","Type":"ContainerDied","Data":"6c0e76c55419e28835b5addc2baf871683786704476377eadb9a576f9a3ff72e"} Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.956718 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 
11:57:44.957224 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-server" containerID="cri-o://949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957465 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-server" containerID="cri-o://452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957559 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-updater" containerID="cri-o://c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957635 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-auditor" containerID="cri-o://affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957699 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-replicator" containerID="cri-o://d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957744 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-server" 
containerID="cri-o://159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957775 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-reaper" containerID="cri-o://897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957804 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-auditor" containerID="cri-o://864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957839 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-replicator" containerID="cri-o://9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957888 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-updater" containerID="cri-o://3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957923 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="swift-recon-cron" containerID="cri-o://da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957952 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="rsync" containerID="cri-o://2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.957982 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-expirer" containerID="cri-o://a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.958028 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-replicator" containerID="cri-o://7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.958071 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-auditor" containerID="cri-o://efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12" gracePeriod=30 Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.964712 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="ovsdbserver-sb" probeResult="failure" output="" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.971169 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.971311 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.971532 4745 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.971583 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:45.471567014 +0000 UTC m=+1552.296768538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : configmap "openstack-cell1-scripts" not found Dec 09 11:57:44 crc kubenswrapper[4745]: I1209 11:57:44.994933 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-r7bpq"] Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.996437 4745 projected.go:194] Error preparing data for projected volume kube-api-access-llr9r for pod openstack/novacell1d49f-account-delete-kt6sb: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:44 crc kubenswrapper[4745]: E1209 11:57:44.996639 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. 
No retries permitted until 2025-12-09 11:57:45.496610829 +0000 UTC m=+1552.321812343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-llr9r" (UniqueName: "kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.021619 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.037251 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdw4"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.071112 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1d49f-account-delete-kt6sb"] Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.077739 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.077824 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data podName:5b860955-30eb-40e6-bd56-caf6098aed8a nodeName:}" failed. No retries permitted until 2025-12-09 11:57:47.077797398 +0000 UTC m=+1553.902998922 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data") pod "rabbitmq-server-0" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a") : configmap "rabbitmq-config-data" not found Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.103442 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.120732 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.147113 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.147457 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="cinder-scheduler" containerID="cri-o://85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.147939 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="probe" containerID="cri-o://7a0d9d93ae67ba5d6ed408c67d268af4c670890fd02fafc91373ab63fe2182ca" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.149032 4745 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.149160 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.217243 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.276771 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi03c2-account-delete-4b54q"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.401367 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.442424 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.442875 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-log" containerID="cri-o://0d37cddfc8eb090792805f604b6641cd1c3d6607edd7eead42610629d259af0b" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.443702 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-httpd" containerID="cri-o://6d7b68aa247edb856bd53e3fca4235d3708496404ba738f128d0babd51b390df" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.508951 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="rabbitmq" containerID="cri-o://8dae8dcac1defac8c58cb335ba486b55e9b6076bf8990bbbc17743c619b726cb" gracePeriod=604800 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.546157 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.546253 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488h7\" (UniqueName: 
\"kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7\") pod \"novaapi03c2-account-delete-4b54q\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.546307 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts\") pod \"novaapi03c2-account-delete-4b54q\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.546366 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.546821 4745 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.546872 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:46.546855475 +0000 UTC m=+1553.372056999 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : configmap "openstack-cell1-scripts" not found Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.558058 4745 projected.go:194] Error preparing data for projected volume kube-api-access-llr9r for pod openstack/novacell1d49f-account-delete-kt6sb: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.558153 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:46.558125459 +0000 UTC m=+1553.383326983 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-llr9r" (UniqueName: "kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.648502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488h7\" (UniqueName: \"kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7\") pod \"novaapi03c2-account-delete-4b54q\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.649525 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts\") pod \"novaapi03c2-account-delete-4b54q\" 
(UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.653073 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts\") pod \"novaapi03c2-account-delete-4b54q\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.679323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c851b2-74b9-43d9-82be-3ee896408d78" path="/var/lib/kubelet/pods/26c851b2-74b9-43d9-82be-3ee896408d78/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.686648 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d83c2d24-3c0c-4097-afd8-1649e08665e4/ovsdbserver-nb/0.log" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.686756 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.696117 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a305c7-8afb-4b56-90b1-e071980fbcdd" path="/var/lib/kubelet/pods/72a305c7-8afb-4b56-90b1-e071980fbcdd/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.716021 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b162675-ea0a-4433-a94a-1f5bd6c81e01" path="/var/lib/kubelet/pods/8b162675-ea0a-4433-a94a-1f5bd6c81e01/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.724795 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1920d0-1725-4be7-baa4-e6561fcce10c" path="/var/lib/kubelet/pods/aa1920d0-1725-4be7-baa4-e6561fcce10c/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.729473 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488h7\" (UniqueName: \"kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7\") pod \"novaapi03c2-account-delete-4b54q\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.756473 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf657ee5-9433-4e1a-9a66-33f59c0e5b0a" path="/var/lib/kubelet/pods/bf657ee5-9433-4e1a-9a66-33f59c0e5b0a/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.760496 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20c4e93-62b6-4c52-9783-eca25d00c91f" path="/var/lib/kubelet/pods/c20c4e93-62b6-4c52-9783-eca25d00c91f/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.796410 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34f458f-6a4f-416a-96b9-18dfd1bb1452" path="/var/lib/kubelet/pods/d34f458f-6a4f-416a-96b9-18dfd1bb1452/volumes" Dec 09 11:57:45 crc 
kubenswrapper[4745]: I1209 11:57:45.799326 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcdefb66-0234-4aa7-97e4-bba6107a3e7d" path="/var/lib/kubelet/pods/dcdefb66-0234-4aa7-97e4-bba6107a3e7d/volumes" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.800233 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.801764 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi03c2-account-delete-4b54q"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.801849 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.802287 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-log" containerID="cri-o://892538c48ca05829a077bcf325c90ee4bb55781acc43871ab0ffe0358d7af1b9" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.802739 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api-log" containerID="cri-o://3cb8e987a7a6b5a17dc3a1ce4421d7a97a43c5bcf0226be502cadc9674ba1fdf" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.803266 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-httpd" containerID="cri-o://a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.803588 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" 
containerName="cinder-api" containerID="cri-o://84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.828153 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0e536-account-delete-zsxfl"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.835774 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b" Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.835787 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="openstack-network-exporter" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.841933 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="openstack-network-exporter" Dec 09 11:57:45 crc kubenswrapper[4745]: E1209 11:57:45.842149 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="ovsdbserver-nb" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.842253 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="ovsdbserver-nb" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.847878 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.848080 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="ovsdbserver-nb" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.848180 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" containerName="openstack-network-exporter" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.849263 4745 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.859929 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.859985 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srvvj\" (UniqueName: \"kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.860082 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.861470 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config" (OuterVolumeSpecName: "config") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.872845 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj" (OuterVolumeSpecName: "kube-api-access-srvvj") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). 
InnerVolumeSpecName "kube-api-access-srvvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.876971 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877097 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877188 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877361 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877546 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877635 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877758 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877856 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdwg\" (UniqueName: \"kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877953 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.878058 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.878149 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle\") pod 
\"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.878274 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir\") pod \"d83c2d24-3c0c-4097-afd8-1649e08665e4\" (UID: \"d83c2d24-3c0c-4097-afd8-1649e08665e4\") " Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.877231 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run" (OuterVolumeSpecName: "var-run") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.878993 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.890036 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.891387 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.891751 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" containerID="cri-o://54a2edbbc1397cc5ce4a40d2a57015e139f31d6f9f4b30325e3a940d377baabd" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.891939 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" containerID="cri-o://2b371925777f6434276ab7a74c14a8e465cf580329fd7735e425f6016bedfb3b" gracePeriod=30 Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.895614 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dw6lx_52964bb1-2d93-4df7-afbc-95f1eb10b8fc/openstack-network-exporter/0.log" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.895725 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.898271 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts" (OuterVolumeSpecName: "scripts") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.903733 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.917343 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.920923 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.922399 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.924280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6q64\" (UniqueName: \"kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938012 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938226 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938318 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srvvj\" (UniqueName: 
\"kubernetes.io/projected/d83c2d24-3c0c-4097-afd8-1649e08665e4-kube-api-access-srvvj\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938525 4745 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938626 4745 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938720 4745 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.938874 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d83c2d24-3c0c-4097-afd8-1649e08665e4-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.962248 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0e536-account-delete-zsxfl"] Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.964160 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:45 crc kubenswrapper[4745]: I1209 11:57:45.964277 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg" (OuterVolumeSpecName: "kube-api-access-mrdwg") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "kube-api-access-mrdwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009571 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009609 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009627 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009636 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009645 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009652 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" 
containerID="452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009660 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009667 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009676 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009685 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009692 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009699 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009707 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009715 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009776 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009813 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009834 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.009843 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010851 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 
11:57:46.010894 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010905 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010914 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010925 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010933 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010942 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010950 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.010995 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.017425 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="rabbitmq" containerID="cri-o://31bf789bc4e44645e0b834e648dabfc8f57c6ad93e2976fce50cb6120b8850cf" gracePeriod=604800 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.020433 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.020688 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener-log" containerID="cri-o://8c3a126676cf77a7477f9d5072236397973a255f712333be799d5fe817503fba" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.021071 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener" containerID="cri-o://84c6b918c9652ae6342e8c33b38c40a7337bb5caf0afd25ca43e70b2c29d044c" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.038640 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.038871 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" containerID="cri-o://4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.041421 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86sck\" (UniqueName: \"kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.041707 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.041819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.042012 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.042250 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: 
\"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.042362 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.042457 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.042586 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.064163 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0\") pod \"2f843182-a85c-47cf-ba16-414c40a031c5\" (UID: \"2f843182-a85c-47cf-ba16-414c40a031c5\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.064419 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrb5g\" (UniqueName: \"kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.064728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.064853 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle\") pod \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\" (UID: \"52964bb1-2d93-4df7-afbc-95f1eb10b8fc\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.064978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts\") pod \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\" (UID: \"dcea58c8-1e21-4581-bcdb-7b1f88e8b463\") " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.065546 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d83c2d24-3c0c-4097-afd8-1649e08665e4/ovsdbserver-nb/0.log" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.065729 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.068667 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.043433 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.074738 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.074773 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d83c2d24-3c0c-4097-afd8-1649e08665e4","Type":"ContainerDied","Data":"9c5f9496854fd26bc519c9bec4075b075f278b18226b402ed3b9fccc9d940e12"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.074819 4745 scope.go:117] "RemoveContainer" containerID="82e76a7787b8495afefaced3bf46a59645fcc2228c694a5e7e6a738e92d9f044" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.073951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6q64\" (UniqueName: \"kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.075959 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.075989 
4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdwg\" (UniqueName: \"kubernetes.io/projected/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-kube-api-access-mrdwg\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.077228 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.078388 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57876487f8-zgj8m" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api-log" containerID="cri-o://76e07b4d01b40ce6fe2153747a2637a847f6695c48e7781c037a25626d453609" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.044797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.045945 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.062259 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.070832 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.071771 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config" (OuterVolumeSpecName: "config") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.078681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts" (OuterVolumeSpecName: "scripts") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.079319 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57876487f8-zgj8m" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api" containerID="cri-o://9b2f931241d0299ec3b05ca17d1893191def91b2abcc6ab8e43b6350e1923813" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.089227 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" containerID="cri-o://b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" gracePeriod=28 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.092357 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.093436 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-log" containerID="cri-o://0959a307a96a3c8b986b8df7cc54318c82c7451c29bafc311bb4eafa178c8154" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.093821 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-api" containerID="cri-o://4d0d0764d9f0fe5167222401d27b58294f16f2e351cb70a946140ba51e127793" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.097444 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dw6lx_52964bb1-2d93-4df7-afbc-95f1eb10b8fc/openstack-network-exporter/0.log" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.097546 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-dw6lx" event={"ID":"52964bb1-2d93-4df7-afbc-95f1eb10b8fc","Type":"ContainerDied","Data":"b7740d4a42e4f85b15f623e06a6befc52cc38d311b5742d6e8f3cac547740344"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.097640 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dw6lx" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.132584 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck" (OuterVolumeSpecName: "kube-api-access-86sck") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). InnerVolumeSpecName "kube-api-access-86sck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.135130 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.135988 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cf679bcc-g65zr" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker-log" containerID="cri-o://247fcd5449ca5281d6d3e9c9cde028019450ce328caca6d4d39e7b414171ced2" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.136629 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cf679bcc-g65zr" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker" containerID="cri-o://09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.144900 4745 generic.go:334] "Generic (PLEG): container finished" podID="25eecd8f-8b17-4e74-b651-78948c627127" containerID="c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a" 
exitCode=143 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.144980 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db6b497c6-z9vtr" event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerDied","Data":"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.156763 4745 generic.go:334] "Generic (PLEG): container finished" podID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerID="0d37cddfc8eb090792805f604b6641cd1c3d6607edd7eead42610629d259af0b" exitCode=143 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.156860 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerDied","Data":"0d37cddfc8eb090792805f604b6641cd1c3d6607edd7eead42610629d259af0b"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.166463 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.166998 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2193006f-4e85-4b55-a6ab-9237f4c9888f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f966743286a79147a7637b37640d7567a316c8d3352249f62fcd961a1b853aed" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.179125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g" (OuterVolumeSpecName: "kube-api-access-vrb5g") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "kube-api-access-vrb5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.185447 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6q64\" (UniqueName: \"kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64\") pod \"novacell0e536-account-delete-zsxfl\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.186851 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e18fbbd-a304-422e-8c13-88ab08fef424/ovsdbserver-sb/0.log" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.186921 4745 generic.go:334] "Generic (PLEG): container finished" podID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerID="f5ff7ad1034757488e9fa6ef3e4534c7b33ad6d4824c61d216c9946583e12257" exitCode=2 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.186946 4745 generic.go:334] "Generic (PLEG): container finished" podID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerID="40aa5562bd3f9aae5ac913d05ca874ce263e3c959881973ff50135ec376a86f0" exitCode=143 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.187014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerDied","Data":"f5ff7ad1034757488e9fa6ef3e4534c7b33ad6d4824c61d216c9946583e12257"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.187062 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerDied","Data":"40aa5562bd3f9aae5ac913d05ca874ce263e3c959881973ff50135ec376a86f0"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.201765 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dqr9b" 
event={"ID":"dcea58c8-1e21-4581-bcdb-7b1f88e8b463","Type":"ContainerDied","Data":"a0808d7f62ef2d56de18068d7de9d014400595e957f3924e71ce402e04b08c06"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.202002 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dqr9b" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203394 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203441 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrb5g\" (UniqueName: \"kubernetes.io/projected/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-kube-api-access-vrb5g\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203455 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203467 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86sck\" (UniqueName: \"kubernetes.io/projected/2f843182-a85c-47cf-ba16-414c40a031c5-kube-api-access-86sck\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203482 4745 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203491 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 
11:57:46.203502 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.203922 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.205768 4745 generic.go:334] "Generic (PLEG): container finished" podID="ac238b52-4167-4847-b66f-6985b784268c" containerID="dc6f10acb25d86858fe8f0ee4fbcf99c84e675bea1bcdd1d53f89781a1482c42" exitCode=137 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.214851 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vhzt5"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.224378 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.226115 4745 generic.go:334] "Generic (PLEG): container finished" podID="2f843182-a85c-47cf-ba16-414c40a031c5" containerID="543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2" exitCode=0 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.226220 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" event={"ID":"2f843182-a85c-47cf-ba16-414c40a031c5","Type":"ContainerDied","Data":"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.226927 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" event={"ID":"2f843182-a85c-47cf-ba16-414c40a031c5","Type":"ContainerDied","Data":"668cfd4de6cfc22ce653d3913709f0add8a26ece3211b567d5a1b8ea2048775a"} Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.227013 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-gxt6p" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.237783 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" containerID="cri-o://d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" gracePeriod=28 Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.307272 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.336765 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.337260 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.365689 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.365788 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.365704 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.385874 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vhzt5"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.411869 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.445256 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 
11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.445646 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.451815 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.451997 4745 scope.go:117] "RemoveContainer" containerID="bff23da6e6763af0c2998317017091c2a8ad2169bcc47bd632c82d8992462e86" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.460207 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d49f-account-create-update-kqsjx"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.479474 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d49f-account-create-update-kqsjx"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.499057 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.500329 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1d49f-account-delete-kt6sb"] Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.501287 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-llr9r operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/novacell1d49f-account-delete-kt6sb" podUID="c305bbe8-a238-40ff-9b8f-2d075cb528e8" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.545487 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.545539 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.545626 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.545686 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data podName:ceba626e-26d1-495f-b88d-fed69e445ddb nodeName:}" failed. No retries permitted until 2025-12-09 11:57:50.545667257 +0000 UTC m=+1557.370868771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data") pod "rabbitmq-cell1-server-0" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb") : configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.598812 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.631991 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zd8rf"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.648551 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d83c2d24-3c0c-4097-afd8-1649e08665e4" (UID: "d83c2d24-3c0c-4097-afd8-1649e08665e4"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.649727 4745 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.651149 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:48.65112594 +0000 UTC m=+1555.476327464 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : configmap "openstack-cell1-scripts" not found Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.649636 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.654015 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.654386 4745 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.657095 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83c2d24-3c0c-4097-afd8-1649e08665e4-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.657522 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.679868 4745 projected.go:194] Error preparing data for projected volume kube-api-access-llr9r for pod openstack/novacell1d49f-account-delete-kt6sb: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:46 crc kubenswrapper[4745]: E1209 11:57:46.680038 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:48.679986818 +0000 UTC m=+1555.505188342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-llr9r" (UniqueName: "kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.750046 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.750392 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerName="nova-cell1-conductor-conductor" containerID="cri-o://709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.759646 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.791434 4745 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.801618 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zd8rf"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.833884 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "dcea58c8-1e21-4581-bcdb-7b1f88e8b463" (UID: "dcea58c8-1e21-4581-bcdb-7b1f88e8b463"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.834077 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.872260 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.875156 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="5d388089-75a9-4e64-8fcf-575fde454708" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae" gracePeriod=30 Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.884416 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.907595 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqbmx"] Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.913340 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config" (OuterVolumeSpecName: "config") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.915521 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcea58c8-1e21-4581-bcdb-7b1f88e8b463-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.915649 4745 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.915720 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.915788 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-config\") on node \"crc\" DevicePath \"\"" Dec 09 
11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.925980 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f843182-a85c-47cf-ba16-414c40a031c5" (UID: "2f843182-a85c-47cf-ba16-414c40a031c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.932323 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:57:46 crc kubenswrapper[4745]: I1209 11:57:46.970941 4745 scope.go:117] "RemoveContainer" containerID="6c0e76c55419e28835b5addc2baf871683786704476377eadb9a576f9a3ff72e" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.017106 4745 scope.go:117] "RemoveContainer" containerID="fe72087dc101255fbdaec3e160e931ac8185f38e75fad2f047784e8531b33949" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.017286 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqbmx"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.020189 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config\") pod \"ac238b52-4167-4847-b66f-6985b784268c\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.020307 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle\") pod \"ac238b52-4167-4847-b66f-6985b784268c\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.020388 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lm46b\" (UniqueName: \"kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b\") pod \"ac238b52-4167-4847-b66f-6985b784268c\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.020451 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret\") pod \"ac238b52-4167-4847-b66f-6985b784268c\" (UID: \"ac238b52-4167-4847-b66f-6985b784268c\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.024587 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f843182-a85c-47cf-ba16-414c40a031c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.033112 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="galera" containerID="cri-o://ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c" gracePeriod=30 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.049162 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b" (OuterVolumeSpecName: "kube-api-access-lm46b") pod "ac238b52-4167-4847-b66f-6985b784268c" (UID: "ac238b52-4167-4847-b66f-6985b784268c"). InnerVolumeSpecName "kube-api-access-lm46b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.074793 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement980e-account-delete-85kkl"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.078041 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "52964bb1-2d93-4df7-afbc-95f1eb10b8fc" (UID: "52964bb1-2d93-4df7-afbc-95f1eb10b8fc"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.097344 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac238b52-4167-4847-b66f-6985b784268c" (UID: "ac238b52-4167-4847-b66f-6985b784268c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.103417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ac238b52-4167-4847-b66f-6985b784268c" (UID: "ac238b52-4167-4847-b66f-6985b784268c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.131479 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.131542 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm46b\" (UniqueName: \"kubernetes.io/projected/ac238b52-4167-4847-b66f-6985b784268c-kube-api-access-lm46b\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.131557 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52964bb1-2d93-4df7-afbc-95f1eb10b8fc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.131571 4745 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac238b52-4167-4847-b66f-6985b784268c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: E1209 11:57:47.132037 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 11:57:47 crc kubenswrapper[4745]: E1209 11:57:47.132121 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data podName:5b860955-30eb-40e6-bd56-caf6098aed8a nodeName:}" failed. No retries permitted until 2025-12-09 11:57:51.132095298 +0000 UTC m=+1557.957296822 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data") pod "rabbitmq-server-0" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a") : configmap "rabbitmq-config-data" not found Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.136762 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron1fc4-account-delete-rqj56"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.138442 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e18fbbd-a304-422e-8c13-88ab08fef424/ovsdbserver-sb/0.log" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.138562 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.190243 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.194708 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-httpd" containerID="cri-o://324e724346a04d819890081d5a91493f6e1feb208d14a31f346c568761a5cbf4" gracePeriod=30 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.195690 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-server" containerID="cri-o://5b3b68afb88cc6f2e0bb3746cc7c964f36672819d43cd68d83781dceeba6aa46" gracePeriod=30 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.220762 4745 scope.go:117] "RemoveContainer" containerID="543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.224938 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder686d-account-delete-s68kh"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.232366 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.232416 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.232446 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.232557 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.234924 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.234985 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.235089 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.235112 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlrjl\" (UniqueName: \"kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl\") pod \"8e18fbbd-a304-422e-8c13-88ab08fef424\" (UID: \"8e18fbbd-a304-422e-8c13-88ab08fef424\") " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.236616 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config" (OuterVolumeSpecName: "config") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.237073 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.237235 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts" (OuterVolumeSpecName: "scripts") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.251046 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.259035 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl" (OuterVolumeSpecName: "kube-api-access-nlrjl") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "kube-api-access-nlrjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.259826 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.275946 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4957-account-delete-b6ngs"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.297700 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.306933 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican83f7-account-delete-5vbd2"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.315352 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ac238b52-4167-4847-b66f-6985b784268c" (UID: "ac238b52-4167-4847-b66f-6985b784268c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341753 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341818 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fbbd-a304-422e-8c13-88ab08fef424-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341875 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341892 4745 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac238b52-4167-4847-b66f-6985b784268c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341908 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" 
Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.341921 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlrjl\" (UniqueName: \"kubernetes.io/projected/8e18fbbd-a304-422e-8c13-88ab08fef424-kube-api-access-nlrjl\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.357347 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.357766 4745 generic.go:334] "Generic (PLEG): container finished" podID="c38b2a61-5161-4132-be1a-65e25531e73a" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" exitCode=0 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.357896 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerDied","Data":"b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.367390 4745 generic.go:334] "Generic (PLEG): container finished" podID="2193006f-4e85-4b55-a6ab-9237f4c9888f" containerID="f966743286a79147a7637b37640d7567a316c8d3352249f62fcd961a1b853aed" exitCode=0 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.367478 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2193006f-4e85-4b55-a6ab-9237f4c9888f","Type":"ContainerDied","Data":"f966743286a79147a7637b37640d7567a316c8d3352249f62fcd961a1b853aed"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.377170 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="f069021c-4758-4a29-98a5-2952a693cef9" containerID="3c97425c86c2cd129c5f56303b9b231be4792489fd433e4c00bc829ef51d297c" exitCode=0 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.377263 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerDied","Data":"3c97425c86c2cd129c5f56303b9b231be4792489fd433e4c00bc829ef51d297c"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.380708 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder686d-account-delete-s68kh" event={"ID":"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066","Type":"ContainerStarted","Data":"e1cfa3f3cd436608cae8cfea37aad0942ba4e426a339a4ea953d4667b2cbe4ac"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.382754 4745 generic.go:334] "Generic (PLEG): container finished" podID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerID="8c3a126676cf77a7477f9d5072236397973a255f712333be799d5fe817503fba" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.382885 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerDied","Data":"8c3a126676cf77a7477f9d5072236397973a255f712333be799d5fe817503fba"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.390838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1fc4-account-delete-rqj56" event={"ID":"f8d65ea6-5ea0-44fa-a4ab-82297d975a87","Type":"ContainerStarted","Data":"7258dfd2d9d91a044e051336a0dd28b89acfb4cee3d06bfe62c49ee36a381e57"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.403438 4745 generic.go:334] "Generic (PLEG): container finished" podID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerID="7a0d9d93ae67ba5d6ed408c67d268af4c670890fd02fafc91373ab63fe2182ca" exitCode=0 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.403552 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerDied","Data":"7a0d9d93ae67ba5d6ed408c67d268af4c670890fd02fafc91373ab63fe2182ca"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.410253 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi03c2-account-delete-4b54q"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.413273 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4957-account-delete-b6ngs" event={"ID":"1b87df09-ea26-4c97-bcd6-4ee7c6250d00","Type":"ContainerStarted","Data":"dce8538aa9459efaa7837ae30be0b9745ee689a38c57c84e12eb58700786d6b5"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.427941 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.436430 4745 generic.go:334] "Generic (PLEG): container finished" podID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerID="892538c48ca05829a077bcf325c90ee4bb55781acc43871ab0ffe0358d7af1b9" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.436588 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerDied","Data":"892538c48ca05829a077bcf325c90ee4bb55781acc43871ab0ffe0358d7af1b9"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.448219 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.449466 4745 generic.go:334] "Generic (PLEG): container finished" podID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerID="3cb8e987a7a6b5a17dc3a1ce4421d7a97a43c5bcf0226be502cadc9674ba1fdf" exitCode=143 
Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.449606 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerDied","Data":"3cb8e987a7a6b5a17dc3a1ce4421d7a97a43c5bcf0226be502cadc9674ba1fdf"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.470473 4745 generic.go:334] "Generic (PLEG): container finished" podID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerID="54a2edbbc1397cc5ce4a40d2a57015e139f31d6f9f4b30325e3a940d377baabd" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.470612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerDied","Data":"54a2edbbc1397cc5ce4a40d2a57015e139f31d6f9f4b30325e3a940d377baabd"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.493757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement980e-account-delete-85kkl" event={"ID":"058e6f79-b92b-47d3-97ae-d588fec5efcd","Type":"ContainerStarted","Data":"29b4a184917de6914515f00cc21f359e17c2106e2966f12c01a46b5cfb3327b7"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.493828 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement980e-account-delete-85kkl" event={"ID":"058e6f79-b92b-47d3-97ae-d588fec5efcd","Type":"ContainerStarted","Data":"7ac766ca90f7a3dc12679f41b40e852c40e34f24b5ca71f1800c4c3e99d9cd84"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.501816 4745 generic.go:334] "Generic (PLEG): container finished" podID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerID="76e07b4d01b40ce6fe2153747a2637a847f6695c48e7781c037a25626d453609" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.501914 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" 
event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerDied","Data":"76e07b4d01b40ce6fe2153747a2637a847f6695c48e7781c037a25626d453609"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.506157 4745 generic.go:334] "Generic (PLEG): container finished" podID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerID="247fcd5449ca5281d6d3e9c9cde028019450ce328caca6d4d39e7b414171ced2" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.506276 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerDied","Data":"247fcd5449ca5281d6d3e9c9cde028019450ce328caca6d4d39e7b414171ced2"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.522432 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerID="0959a307a96a3c8b986b8df7cc54318c82c7451c29bafc311bb4eafa178c8154" exitCode=143 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.522592 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerDied","Data":"0959a307a96a3c8b986b8df7cc54318c82c7451c29bafc311bb4eafa178c8154"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.528131 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e18fbbd-a304-422e-8c13-88ab08fef424/ovsdbserver-sb/0.log" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.528248 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.529075 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.531578 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e18fbbd-a304-422e-8c13-88ab08fef424","Type":"ContainerDied","Data":"13acead9b2f70206b0d4d9acc4b65b11ce773376a33a579d14ee567690763687"} Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.542866 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement980e-account-delete-85kkl" podStartSLOduration=4.542830983 podStartE2EDuration="4.542830983s" podCreationTimestamp="2025-12-09 11:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:57:47.519405041 +0000 UTC m=+1554.344606575" watchObservedRunningTime="2025-12-09 11:57:47.542830983 +0000 UTC m=+1554.368032507" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.545616 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.554485 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.620783 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43837717-2618-45c9-afac-172826694ae5" path="/var/lib/kubelet/pods/43837717-2618-45c9-afac-172826694ae5/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.622270 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df4272b-8d98-4788-8513-1f3ab014775c" path="/var/lib/kubelet/pods/8df4272b-8d98-4788-8513-1f3ab014775c/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.623302 4745 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ac238b52-4167-4847-b66f-6985b784268c" path="/var/lib/kubelet/pods/ac238b52-4167-4847-b66f-6985b784268c/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.625965 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7f8892-3148-4488-bfa2-afe44917e31d" path="/var/lib/kubelet/pods/ae7f8892-3148-4488-bfa2-afe44917e31d/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.627714 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83c2d24-3c0c-4097-afd8-1649e08665e4" path="/var/lib/kubelet/pods/d83c2d24-3c0c-4097-afd8-1649e08665e4/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.629380 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1ee906-8314-4be0-9845-f8abfe129175" path="/var/lib/kubelet/pods/ea1ee906-8314-4be0-9845-f8abfe129175/volumes" Dec 09 11:57:47 crc kubenswrapper[4745]: W1209 11:57:47.637488 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode822e9c0_d6fa_4880_a0e3_8dfb32405a6f.slice/crio-0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197 WatchSource:0}: Error finding container 0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197: Status 404 returned error can't find the container with id 0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197 Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.664727 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.688428 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "8e18fbbd-a304-422e-8c13-88ab08fef424" (UID: "8e18fbbd-a304-422e-8c13-88ab08fef424"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.743173 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0e536-account-delete-zsxfl"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.749828 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.759334 4745 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.759371 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18fbbd-a304-422e-8c13-88ab08fef424-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.784951 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-gxt6p"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.787141 4745 scope.go:117] "RemoveContainer" containerID="97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.851398 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.863107 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.870855 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dqr9b"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.874419 4745 scope.go:117] "RemoveContainer" containerID="543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2" Dec 09 11:57:47 crc kubenswrapper[4745]: E1209 11:57:47.874871 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2\": container with ID starting with 543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2 not found: ID does not exist" containerID="543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.874910 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2"} err="failed to get container status \"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2\": rpc error: code = NotFound desc = could not find container \"543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2\": container with ID starting with 543eb1f623aed841beae952f8e46e93a8e7442d03c588d9a15fad9c843e15cc2 not found: ID does not exist" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.874935 4745 scope.go:117] "RemoveContainer" containerID="97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0" Dec 09 11:57:47 crc kubenswrapper[4745]: E1209 11:57:47.875480 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0\": container with ID starting with 97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0 not found: ID does not exist" containerID="97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.875526 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0"} err="failed to get container status \"97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0\": rpc error: code = NotFound desc = could not find container \"97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0\": container with ID starting with 97ec8fabc6b86620361830e04b72f36435b3629bd52098c9979c36dcf3397fc0 not found: ID does not exist" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.875542 4745 scope.go:117] "RemoveContainer" containerID="dc6f10acb25d86858fe8f0ee4fbcf99c84e675bea1bcdd1d53f89781a1482c42" Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.899552 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.920067 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-dw6lx"] Dec 09 11:57:47 crc kubenswrapper[4745]: I1209 11:57:47.971005 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.069640 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nlmw\" (UniqueName: \"kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw\") pod \"2193006f-4e85-4b55-a6ab-9237f4c9888f\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.070498 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle\") pod \"2193006f-4e85-4b55-a6ab-9237f4c9888f\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.070649 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs\") pod \"2193006f-4e85-4b55-a6ab-9237f4c9888f\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.070716 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs\") pod \"2193006f-4e85-4b55-a6ab-9237f4c9888f\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.071180 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data\") pod \"2193006f-4e85-4b55-a6ab-9237f4c9888f\" (UID: \"2193006f-4e85-4b55-a6ab-9237f4c9888f\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.089989 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw" (OuterVolumeSpecName: "kube-api-access-4nlmw") pod "2193006f-4e85-4b55-a6ab-9237f4c9888f" (UID: "2193006f-4e85-4b55-a6ab-9237f4c9888f"). InnerVolumeSpecName "kube-api-access-4nlmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.145242 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2193006f-4e85-4b55-a6ab-9237f4c9888f" (UID: "2193006f-4e85-4b55-a6ab-9237f4c9888f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.177617 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nlmw\" (UniqueName: \"kubernetes.io/projected/2193006f-4e85-4b55-a6ab-9237f4c9888f-kube-api-access-4nlmw\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.177921 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.204205 4745 scope.go:117] "RemoveContainer" containerID="f5ff7ad1034757488e9fa6ef3e4534c7b33ad6d4824c61d216c9946583e12257" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.210432 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.229716 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.231035 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data" (OuterVolumeSpecName: "config-data") pod "2193006f-4e85-4b55-a6ab-9237f4c9888f" (UID: "2193006f-4e85-4b55-a6ab-9237f4c9888f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.238797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "2193006f-4e85-4b55-a6ab-9237f4c9888f" (UID: "2193006f-4e85-4b55-a6ab-9237f4c9888f"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.283723 4745 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.283757 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.287571 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.293649 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.321098 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.321199 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.357214 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "2193006f-4e85-4b55-a6ab-9237f4c9888f" (UID: "2193006f-4e85-4b55-a6ab-9237f4c9888f"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.369980 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.373318 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.374870 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.374911 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerName="nova-cell1-conductor-conductor" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.392247 4745 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2193006f-4e85-4b55-a6ab-9237f4c9888f-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.557099 4745 scope.go:117] "RemoveContainer" 
containerID="40aa5562bd3f9aae5ac913d05ca874ce263e3c959881973ff50135ec376a86f0" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.562797 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4957-account-delete-b6ngs" event={"ID":"1b87df09-ea26-4c97-bcd6-4ee7c6250d00","Type":"ContainerStarted","Data":"292c118e9ea99e79f59b1386147eb09c4d2872305f317b469d80803ac95abb15"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.579270 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi03c2-account-delete-4b54q" event={"ID":"0da87c5d-f709-4d9b-b182-421edbb61f00","Type":"ContainerStarted","Data":"c5fb98f1a64a8c7231b9ecbecc3a768da83bc7e9f003e31f3af3d1abc4bc52b1"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.597226 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance4957-account-delete-b6ngs" podStartSLOduration=5.597193992 podStartE2EDuration="5.597193992s" podCreationTimestamp="2025-12-09 11:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:57:48.584990103 +0000 UTC m=+1555.410191637" watchObservedRunningTime="2025-12-09 11:57:48.597193992 +0000 UTC m=+1555.422395526" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.607559 4745 generic.go:334] "Generic (PLEG): container finished" podID="f8d65ea6-5ea0-44fa-a4ab-82297d975a87" containerID="e0758a906f3658a8bea2c9761871942f25867001c078826c156c644ccceb845e" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.607685 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1fc4-account-delete-rqj56" event={"ID":"f8d65ea6-5ea0-44fa-a4ab-82297d975a87","Type":"ContainerDied","Data":"e0758a906f3658a8bea2c9761871942f25867001c078826c156c644ccceb845e"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.614029 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2193006f-4e85-4b55-a6ab-9237f4c9888f","Type":"ContainerDied","Data":"e11f875f937a177291e0bd6654f176ad55033b644b0ff0a3f2dc38fa91d224f0"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.614164 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.623766 4745 generic.go:334] "Generic (PLEG): container finished" podID="058e6f79-b92b-47d3-97ae-d588fec5efcd" containerID="29b4a184917de6914515f00cc21f359e17c2106e2966f12c01a46b5cfb3327b7" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.623920 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement980e-account-delete-85kkl" event={"ID":"058e6f79-b92b-47d3-97ae-d588fec5efcd","Type":"ContainerDied","Data":"29b4a184917de6914515f00cc21f359e17c2106e2966f12c01a46b5cfb3327b7"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.638817 4745 generic.go:334] "Generic (PLEG): container finished" podID="8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" containerID="feb47bcafcf25de8587ef4bb55ae093b5fe21d5f5e5449021a1e056cb1aec196" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.638916 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder686d-account-delete-s68kh" event={"ID":"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066","Type":"ContainerDied","Data":"feb47bcafcf25de8587ef4bb55ae093b5fe21d5f5e5449021a1e056cb1aec196"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.639969 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.676723 4745 generic.go:334] "Generic (PLEG): container finished" podID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerID="ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.676808 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerDied","Data":"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.676838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"70a88ab6-2793-4952-b04c-9041a15e83f9","Type":"ContainerDied","Data":"aaa8b3251637529b75cae7d64799a3625bb55130e930c3ed9d03811da913249e"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.677127 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.687483 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e536-account-delete-zsxfl" event={"ID":"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f","Type":"ContainerStarted","Data":"0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.704024 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.717323 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.717779 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.717904 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.718284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.718367 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config" 
(OuterVolumeSpecName: "kolla-config") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.718459 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.718647 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.722144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.722226 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.722320 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdtsx\" (UniqueName: \"kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx\") pod \"70a88ab6-2793-4952-b04c-9041a15e83f9\" (UID: \"70a88ab6-2793-4952-b04c-9041a15e83f9\") " Dec 09 
11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.720681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.724354 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.726406 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.726416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.726929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") pod \"novacell1d49f-account-delete-kt6sb\" (UID: \"c305bbe8-a238-40ff-9b8f-2d075cb528e8\") " pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.727176 4745 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.727225 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.727241 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:52.727219028 +0000 UTC m=+1559.552420552 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : configmap "openstack-cell1-scripts" not found Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.727265 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.727280 4745 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.727291 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70a88ab6-2793-4952-b04c-9041a15e83f9-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.732175 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican83f7-account-delete-5vbd2" event={"ID":"88802adc-d164-420b-98d2-a757b6627350","Type":"ContainerStarted","Data":"c4860343d4ee81a640779864c9d8d1bbc93d0c65395dd5b02fec146d0aee40be"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.732234 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican83f7-account-delete-5vbd2" event={"ID":"88802adc-d164-420b-98d2-a757b6627350","Type":"ContainerStarted","Data":"3988406d5ae9fe980ddde74bc971cdf6095b3566ba793301f7161d913834d4be"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.746123 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx" (OuterVolumeSpecName: 
"kube-api-access-fdtsx") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "kube-api-access-fdtsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.754622 4745 projected.go:194] Error preparing data for projected volume kube-api-access-llr9r for pod openstack/novacell1d49f-account-delete-kt6sb: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:48 crc kubenswrapper[4745]: E1209 11:57:48.754718 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r podName:c305bbe8-a238-40ff-9b8f-2d075cb528e8 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:52.754694229 +0000 UTC m=+1559.579895753 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-llr9r" (UniqueName: "kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r") pod "novacell1d49f-account-delete-kt6sb" (UID: "c305bbe8-a238-40ff-9b8f-2d075cb528e8") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.782293 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.782496 4745 scope.go:117] "RemoveContainer" containerID="f966743286a79147a7637b37640d7567a316c8d3352249f62fcd961a1b853aed" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.795662 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.807072 4745 generic.go:334] "Generic (PLEG): container finished" podID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerID="5b3b68afb88cc6f2e0bb3746cc7c964f36672819d43cd68d83781dceeba6aa46" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.807122 4745 generic.go:334] "Generic (PLEG): container finished" podID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerID="324e724346a04d819890081d5a91493f6e1feb208d14a31f346c568761a5cbf4" exitCode=0 Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.807229 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1d49f-account-delete-kt6sb" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.807983 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerDied","Data":"5b3b68afb88cc6f2e0bb3746cc7c964f36672819d43cd68d83781dceeba6aa46"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.808044 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerDied","Data":"324e724346a04d819890081d5a91493f6e1feb208d14a31f346c568761a5cbf4"} Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.817270 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican83f7-account-delete-5vbd2" podStartSLOduration=5.817236055 podStartE2EDuration="5.817236055s" podCreationTimestamp="2025-12-09 11:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 11:57:48.807440081 +0000 UTC m=+1555.632641605" watchObservedRunningTime="2025-12-09 11:57:48.817236055 +0000 UTC m=+1555.642437579" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.840311 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.840474 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.840493 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdtsx\" (UniqueName: 
\"kubernetes.io/projected/70a88ab6-2793-4952-b04c-9041a15e83f9-kube-api-access-fdtsx\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.874897 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 09 11:57:48 crc kubenswrapper[4745]: I1209 11:57:48.954711 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.079604 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1d49f-account-delete-kt6sb"] Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.149833 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1d49f-account-delete-kt6sb"] Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.164771 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "70a88ab6-2793-4952-b04c-9041a15e83f9" (UID: "70a88ab6-2793-4952-b04c-9041a15e83f9"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.168370 4745 scope.go:117] "RemoveContainer" containerID="ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.262956 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llr9r\" (UniqueName: \"kubernetes.io/projected/c305bbe8-a238-40ff-9b8f-2d075cb528e8-kube-api-access-llr9r\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.263217 4745 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/70a88ab6-2793-4952-b04c-9041a15e83f9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.263227 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c305bbe8-a238-40ff-9b8f-2d075cb528e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.367222 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.377030 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:40236->10.217.0.179:9292: read: connection reset by peer" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.377461 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:40248->10.217.0.179:9292: read: connection reset by peer" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.391401 4745 scope.go:117] "RemoveContainer" containerID="f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.398092 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.402389 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.413769 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.420942 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 10.217.0.2:53770->10.217.0.174:8776: read: connection reset by peer" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.422711 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:44870->10.217.0.202:8775: read: connection reset by peer" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.422863 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:44882->10.217.0.202:8775: read: connection reset by peer" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.479722 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.479782 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.479886 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.479916 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.479957 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480003 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4btj2\" (UniqueName: \"kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 
crc kubenswrapper[4745]: I1209 11:57:49.480097 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480130 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480179 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480249 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480275 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift\") pod 
\"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480307 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs\") pod \"25eecd8f-8b17-4e74-b651-78948c627127\" (UID: \"25eecd8f-8b17-4e74-b651-78948c627127\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.480326 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf8gr\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr\") pod \"56e7dac8-1382-446b-88c7-47104b5a89cf\" (UID: \"56e7dac8-1382-446b-88c7-47104b5a89cf\") " Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.486559 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts" (OuterVolumeSpecName: "scripts") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.490080 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.490381 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs" (OuterVolumeSpecName: "logs") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.491497 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.494881 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2" (OuterVolumeSpecName: "kube-api-access-4btj2") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "kube-api-access-4btj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.505258 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.523112 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr" (OuterVolumeSpecName: "kube-api-access-mf8gr") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "kube-api-access-mf8gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.582280 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2193006f-4e85-4b55-a6ab-9237f4c9888f" path="/var/lib/kubelet/pods/2193006f-4e85-4b55-a6ab-9237f4c9888f/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.583838 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" path="/var/lib/kubelet/pods/2f843182-a85c-47cf-ba16-414c40a031c5/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584089 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584125 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584135 4745 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584148 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf8gr\" (UniqueName: \"kubernetes.io/projected/56e7dac8-1382-446b-88c7-47104b5a89cf-kube-api-access-mf8gr\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584157 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56e7dac8-1382-446b-88c7-47104b5a89cf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584166 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/25eecd8f-8b17-4e74-b651-78948c627127-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.584175 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4btj2\" (UniqueName: \"kubernetes.io/projected/25eecd8f-8b17-4e74-b651-78948c627127-kube-api-access-4btj2\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.588739 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" path="/var/lib/kubelet/pods/52964bb1-2d93-4df7-afbc-95f1eb10b8fc/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.594248 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" path="/var/lib/kubelet/pods/70a88ab6-2793-4952-b04c-9041a15e83f9/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.603148 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" path="/var/lib/kubelet/pods/8e18fbbd-a304-422e-8c13-88ab08fef424/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.603907 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c305bbe8-a238-40ff-9b8f-2d075cb528e8" path="/var/lib/kubelet/pods/c305bbe8-a238-40ff-9b8f-2d075cb528e8/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.622990 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" path="/var/lib/kubelet/pods/dcea58c8-1e21-4581-bcdb-7b1f88e8b463/volumes" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.720714 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.738624 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.755338 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.789050 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.789109 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.789119 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.794922 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data" (OuterVolumeSpecName: "config-data") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.825767 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.857677 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data" (OuterVolumeSpecName: "config-data") pod "56e7dac8-1382-446b-88c7-47104b5a89cf" (UID: "56e7dac8-1382-446b-88c7-47104b5a89cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.874546 4745 generic.go:334] "Generic (PLEG): container finished" podID="88802adc-d164-420b-98d2-a757b6627350" containerID="c4860343d4ee81a640779864c9d8d1bbc93d0c65395dd5b02fec146d0aee40be" exitCode=0 Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.881866 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.900303 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.900338 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e7dac8-1382-446b-88c7-47104b5a89cf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.900348 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.900360 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.904083 4745 generic.go:334] "Generic (PLEG): container finished" podID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerID="9b2f931241d0299ec3b05ca17d1893191def91b2abcc6ab8e43b6350e1923813" exitCode=0 Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.907396 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.930417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "25eecd8f-8b17-4e74-b651-78948c627127" (UID: "25eecd8f-8b17-4e74-b651-78948c627127"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.946062 4745 generic.go:334] "Generic (PLEG): container finished" podID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerID="a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db" exitCode=0 Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.956848 4745 generic.go:334] "Generic (PLEG): container finished" podID="e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" containerID="d86765cb79f89a9cbc7de973c711eb1d6f3dd9f41b2b18a025ac97233e7bc3a7" exitCode=0 Dec 09 11:57:49 crc kubenswrapper[4745]: I1209 11:57:49.990560 4745 generic.go:334] "Generic (PLEG): container finished" podID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerID="84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.005821 4745 generic.go:334] "Generic (PLEG): container finished" podID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerID="6d7b68aa247edb856bd53e3fca4235d3708496404ba738f128d0babd51b390df" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.006886 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eecd8f-8b17-4e74-b651-78948c627127-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.009573 4745 generic.go:334] "Generic (PLEG): container finished" podID="0da87c5d-f709-4d9b-b182-421edbb61f00" containerID="decfdf9a27f78a38287e325560f9246d8ee33368520c37c1e0ab35b37e5ff1ae" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017665 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican83f7-account-delete-5vbd2" event={"ID":"88802adc-d164-420b-98d2-a757b6627350","Type":"ContainerDied","Data":"c4860343d4ee81a640779864c9d8d1bbc93d0c65395dd5b02fec146d0aee40be"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017715 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerDied","Data":"9b2f931241d0299ec3b05ca17d1893191def91b2abcc6ab8e43b6350e1923813"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c99cd79f9-qgnsc" event={"ID":"56e7dac8-1382-446b-88c7-47104b5a89cf","Type":"ContainerDied","Data":"4ba713996b77f2d4999979b9b73f7452cf113d5729d70af3dbe6b23cede34a3b"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017841 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerDied","Data":"a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017860 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e536-account-delete-zsxfl" event={"ID":"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f","Type":"ContainerDied","Data":"d86765cb79f89a9cbc7de973c711eb1d6f3dd9f41b2b18a025ac97233e7bc3a7"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017876 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerDied","Data":"84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017893 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerDied","Data":"6d7b68aa247edb856bd53e3fca4235d3708496404ba738f128d0babd51b390df"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.017909 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi03c2-account-delete-4b54q" 
event={"ID":"0da87c5d-f709-4d9b-b182-421edbb61f00","Type":"ContainerDied","Data":"decfdf9a27f78a38287e325560f9246d8ee33368520c37c1e0ab35b37e5ff1ae"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.040964 4745 generic.go:334] "Generic (PLEG): container finished" podID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerID="84c6b918c9652ae6342e8c33b38c40a7337bb5caf0afd25ca43e70b2c29d044c" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.041125 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerDied","Data":"84c6b918c9652ae6342e8c33b38c40a7337bb5caf0afd25ca43e70b2c29d044c"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.050528 4745 generic.go:334] "Generic (PLEG): container finished" podID="1b87df09-ea26-4c97-bcd6-4ee7c6250d00" containerID="292c118e9ea99e79f59b1386147eb09c4d2872305f317b469d80803ac95abb15" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.050698 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4957-account-delete-b6ngs" event={"ID":"1b87df09-ea26-4c97-bcd6-4ee7c6250d00","Type":"ContainerDied","Data":"292c118e9ea99e79f59b1386147eb09c4d2872305f317b469d80803ac95abb15"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.074026 4745 generic.go:334] "Generic (PLEG): container finished" podID="25eecd8f-8b17-4e74-b651-78948c627127" containerID="7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.074138 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db6b497c6-z9vtr" event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerDied","Data":"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.074175 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-7db6b497c6-z9vtr" event={"ID":"25eecd8f-8b17-4e74-b651-78948c627127","Type":"ContainerDied","Data":"92f474fd86deaf26a3ebed0f6f198818937e580e74a9f872d9b4abf33086e260"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.074265 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db6b497c6-z9vtr" Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.079336 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.085140 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerID="4d0d0764d9f0fe5167222401d27b58294f16f2e351cb70a946140ba51e127793" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.085285 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerDied","Data":"4d0d0764d9f0fe5167222401d27b58294f16f2e351cb70a946140ba51e127793"} Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.114244 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.119079 4745 scope.go:117] "RemoveContainer" containerID="ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c" Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.125407 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.125575 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c\": container with ID starting with ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c not found: ID does not exist" containerID="ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.125605 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c"} err="failed to get container status \"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c\": rpc error: code = NotFound desc = could not find container \"ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c\": container with ID starting with ebc3184d8235e913b37b06d481dcea6c9617fbc8229add4be26ceee31189956c not found: ID does not exist" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.125632 4745 scope.go:117] "RemoveContainer" containerID="f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3" Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.131309 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3\": container with ID starting with f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3 not found: ID does not exist" containerID="f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.131334 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3"} err="failed to get container status \"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3\": rpc error: code = NotFound desc = could not find container \"f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3\": container with ID starting with f2eada94a0c5b273d35a92a2faa4601639f853c423ecc3e3abc9140ff66f94e3 not found: ID does not exist" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.131355 4745 scope.go:117] "RemoveContainer" containerID="5b3b68afb88cc6f2e0bb3746cc7c964f36672819d43cd68d83781dceeba6aa46" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.134335 4745 generic.go:334] "Generic (PLEG): container finished" podID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerID="2b371925777f6434276ab7a74c14a8e465cf580329fd7735e425f6016bedfb3b" exitCode=0 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.134810 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerDied","Data":"2b371925777f6434276ab7a74c14a8e465cf580329fd7735e425f6016bedfb3b"} Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.135437 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.135493 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" Dec 09 11:57:50 crc kubenswrapper[4745]: 
I1209 11:57:50.173729 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5c99cd79f9-qgnsc"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.332097 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.332977 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-central-agent" containerID="cri-o://5fd672e48430bca4d1c2a71daa80ccf27bd2fcb9360ab8501dfd3906ec36e53e" gracePeriod=30 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.333241 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="proxy-httpd" containerID="cri-o://29da6f720b0b04eb76a45efbd214f1c1f0273bc06fc58a52d1909cd993ab7077" gracePeriod=30 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.333487 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-notification-agent" containerID="cri-o://c29448a85553503fc1fcb48e56752ab6bca8a6dd9078d1c9f160ca8db10b194d" gracePeriod=30 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.333589 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="sg-core" containerID="cri-o://2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831" gracePeriod=30 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.413102 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.413391 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="a506f944-5b99-48af-a714-e24782ba1c06" containerName="kube-state-metrics" containerID="cri-o://84094740573a16863643bdc2da525be2a9ddb0b73b675078b658770516cc0d5e" gracePeriod=30 Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.561756 4745 scope.go:117] "RemoveContainer" containerID="324e724346a04d819890081d5a91493f6e1feb208d14a31f346c568761a5cbf4" Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.600390 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:50 crc kubenswrapper[4745]: E1209 11:57:50.600539 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data podName:ceba626e-26d1-495f-b88d-fed69e445ddb nodeName:}" failed. No retries permitted until 2025-12-09 11:57:58.600486606 +0000 UTC m=+1565.425688140 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data") pod "rabbitmq-cell1-server-0" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb") : configmap "rabbitmq-cell1-config-data" not found Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.603950 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.654825 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.721605 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7db6b497c6-z9vtr"] Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.740794 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.740892 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.740964 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.741025 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.741103 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data\") 
pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.741165 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.741330 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztgzt\" (UniqueName: \"kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.741385 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs\") pod \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\" (UID: \"0452bd55-b0a3-46a9-a388-6db2e40f4cb7\") " Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.745185 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs" (OuterVolumeSpecName: "logs") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:50 crc kubenswrapper[4745]: I1209 11:57:50.764416 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.773470 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.798202 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt" (OuterVolumeSpecName: "kube-api-access-ztgzt") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "kube-api-access-ztgzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.854354 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.854689 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="84ab78e7-7419-4892-92c0-085db552be56" containerName="memcached" containerID="cri-o://165fc14fd97a535329fc700bc5ee13ff68c82e069d90770b464ef7ed62a40419" gracePeriod=30 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.856122 4745 scope.go:117] "RemoveContainer" containerID="7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.856340 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.856725 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts" (OuterVolumeSpecName: "scripts") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.880487 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.880537 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.880546 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztgzt\" (UniqueName: \"kubernetes.io/projected/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-kube-api-access-ztgzt\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.880560 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.883922 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vxcd9"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.892380 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p9tff"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.892965 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.902686 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p9tff"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.910073 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vxcd9"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.917738 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone5bf8-account-delete-c4n86"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918291 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-api" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918305 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-api" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918316 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2193006f-4e85-4b55-a6ab-9237f4c9888f" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918324 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2193006f-4e85-4b55-a6ab-9237f4c9888f" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918341 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918347 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918371 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 
11:57:50.918378 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918389 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918395 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918402 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="dnsmasq-dns" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918408 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="dnsmasq-dns" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918418 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-log" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918423 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-log" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918433 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="ovsdbserver-sb" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918439 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="ovsdbserver-sb" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918449 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="galera" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918454 
4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="galera" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918467 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-server" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918473 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-server" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918483 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918489 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918498 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-log" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918521 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-log" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918532 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="init" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918538 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="init" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918554 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="mysql-bootstrap" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918564 4745 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="mysql-bootstrap" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:50.918581 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918588 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea58c8-1e21-4581-bcdb-7b1f88e8b463" containerName="ovn-controller" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918778 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2193006f-4e85-4b55-a6ab-9237f4c9888f" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918808 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918816 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918828 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e18fbbd-a304-422e-8c13-88ab08fef424" containerName="ovsdbserver-sb" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918839 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-api" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918851 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" containerName="proxy-server" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918860 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-httpd" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918867 4745 
memory_manager.go:354] "RemoveStaleState removing state" podUID="70a88ab6-2793-4952-b04c-9041a15e83f9" containerName="galera" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918876 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="52964bb1-2d93-4df7-afbc-95f1eb10b8fc" containerName="openstack-network-exporter" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918890 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f843182-a85c-47cf-ba16-414c40a031c5" containerName="dnsmasq-dns" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918903 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="25eecd8f-8b17-4e74-b651-78948c627127" containerName="placement-log" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.918919 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" containerName="glance-log" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.919776 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.949861 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.950121 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-56c9f47db8-wmdgz" podUID="ff59337d-f366-446b-9752-eb371ee468e4" containerName="keystone-api" containerID="cri-o://e26931e675e3e053e3c67af648d2433105486188c6e62356a6e8b58996656df4" gracePeriod=30 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.969875 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone5bf8-account-delete-c4n86"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.984941 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.988160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqq2v\" (UniqueName: \"kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.988347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:50.988455 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.029867 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-r4f68"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.055122 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-r4f68"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.076489 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone5bf8-account-delete-c4n86"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.085562 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.090619 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.090775 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqq2v\" (UniqueName: \"kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.090881 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.090965 4745 configmap.go:193] Couldn't get 
configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.091117 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:51.591087445 +0000 UTC m=+1558.416288969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : configmap "openstack-scripts" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.097545 4745 projected.go:194] Error preparing data for projected volume kube-api-access-qqq2v for pod openstack/keystone5bf8-account-delete-c4n86: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.097944 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:51.597912889 +0000 UTC m=+1558.423114413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqq2v" (UniqueName: "kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.101254 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5bf8-account-create-update-pxtd8"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.108051 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5bf8-account-create-update-pxtd8"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.152872 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda94483d_f361_42ef_95b4_d4b2c79b4d80.slice/crio-09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56e7dac8_1382_446b_88c7_47104b5a89cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48a868ce_c1ab_457a_bd7f_224f8e982a13.slice/crio-84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaef3c48_5e7c_4ea3_a2d0_da44ea528455.slice/crio-conmon-85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f786d16_8a6e_420b_b2b7_f785386e2191.slice/crio-conmon-709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d93c20_efbc_41c3_bea1_7e7dad1ae70d.slice/crio-2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73ed8df4_e28f_4c76_bca1_a3d77ef789d4.slice/crio-a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda94483d_f361_42ef_95b4_d4b2c79b4d80.slice/crio-conmon-09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56e7dac8_1382_446b_88c7_47104b5a89cf.slice/crio-4ba713996b77f2d4999979b9b73f7452cf113d5729d70af3dbe6b23cede34a3b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25eecd8f_8b17_4e74_b651_78948c627127.slice\": RecentStats: unable to find data in memory cache]" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.190600 4745 generic.go:334] "Generic (PLEG): container finished" podID="392b878a-37ba-4887-a699-672c8b92e947" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" exitCode=0 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.190691 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392b878a-37ba-4887-a699-672c8b92e947","Type":"ContainerDied","Data":"4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1"} Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.193707 4745 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.193764 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data podName:5b860955-30eb-40e6-bd56-caf6098aed8a nodeName:}" failed. No retries permitted until 2025-12-09 11:57:59.193747833 +0000 UTC m=+1566.018949357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data") pod "rabbitmq-server-0" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a") : configmap "rabbitmq-config-data" not found Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.196291 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48a868ce-c1ab-457a-bd7f-224f8e982a13","Type":"ContainerDied","Data":"c36983d61b2699871ca1ed37f3d5714e01df50b267fc939dd2ffc2d0e03bdef2"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.196346 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36983d61b2699871ca1ed37f3d5714e01df50b267fc939dd2ffc2d0e03bdef2" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.201974 4745 scope.go:117] "RemoveContainer" containerID="c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.222022 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data" (OuterVolumeSpecName: "config-data") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.241932 4745 generic.go:334] "Generic (PLEG): container finished" podID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerID="09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd" exitCode=0 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.242052 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerDied","Data":"09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.243910 4745 generic.go:334] "Generic (PLEG): container finished" podID="a506f944-5b99-48af-a714-e24782ba1c06" containerID="84094740573a16863643bdc2da525be2a9ddb0b73b675078b658770516cc0d5e" exitCode=2 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.243953 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a506f944-5b99-48af-a714-e24782ba1c06","Type":"ContainerDied","Data":"84094740573a16863643bdc2da525be2a9ddb0b73b675078b658770516cc0d5e"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.251032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9e009b8-8eec-4028-ade9-84bc49d236c8","Type":"ContainerDied","Data":"7fc3a9b03537f8a84a8d51c00b0cac0413c631e73c2ee7cb3b249c44da01c4e7"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.251100 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc3a9b03537f8a84a8d51c00b0cac0413c631e73c2ee7cb3b249c44da01c4e7" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.254142 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"0452bd55-b0a3-46a9-a388-6db2e40f4cb7","Type":"ContainerDied","Data":"3d993fc02b6d7627f5ebff8193e42959f19300f1c6d87f7ca61384cdbda0b025"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.254254 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.257087 4745 generic.go:334] "Generic (PLEG): container finished" podID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerID="709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" exitCode=0 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.257172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f786d16-8a6e-420b-b2b7-f785386e2191","Type":"ContainerDied","Data":"709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.279623 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c180eac0-e93f-4067-ba6d-32a023f424e6","Type":"ContainerDied","Data":"de2677d35a9062cecaef36be6523796776d3310b2806e5606298434ebb2fe652"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.279673 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2677d35a9062cecaef36be6523796776d3310b2806e5606298434ebb2fe652" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.284494 4745 generic.go:334] "Generic (PLEG): container finished" podID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerID="29da6f720b0b04eb76a45efbd214f1c1f0273bc06fc58a52d1909cd993ab7077" exitCode=0 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.284552 4745 generic.go:334] "Generic (PLEG): container finished" podID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerID="2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831" exitCode=2 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.284597 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerDied","Data":"29da6f720b0b04eb76a45efbd214f1c1f0273bc06fc58a52d1909cd993ab7077"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.284620 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerDied","Data":"2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831"} Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.305623 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0452bd55-b0a3-46a9-a388-6db2e40f4cb7" (UID: "0452bd55-b0a3-46a9-a388-6db2e40f4cb7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.306097 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.306134 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0452bd55-b0a3-46a9-a388-6db2e40f4cb7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.306135 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:51 crc 
kubenswrapper[4745]: E1209 11:57:51.309955 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.310225 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.311664 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.311749 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.312000 4745 generic.go:334] "Generic (PLEG): container finished" podID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" 
containerID="85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025" exitCode=0 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.312748 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerDied","Data":"85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025"} Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.318150 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.365794 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.365916 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.457321 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qqq2v operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone5bf8-account-delete-c4n86" podUID="a7755562-a5b1-4b4d-833c-5a3179a2926c" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.500557 4745 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.519155 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.519473 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.520432 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.533792 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57876487f8-zgj8m" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.561846 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.628290 4745 scope.go:117] "RemoveContainer" containerID="7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629140 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dszll\" (UniqueName: \"kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll\") pod \"c180eac0-e93f-4067-ba6d-32a023f424e6\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629172 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 
11:57:51.629198 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52wnm\" (UniqueName: \"kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629234 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629257 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629322 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629348 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629396 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629439 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629461 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629480 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.629525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630503 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630542 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle\") pod \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630562 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630616 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g76qp\" (UniqueName: \"kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630645 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630666 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc 
kubenswrapper[4745]: I1209 11:57:51.630687 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs\") pod \"c180eac0-e93f-4067-ba6d-32a023f424e6\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630706 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs\") pod \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630759 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630779 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data\") pod \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630821 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrbr\" (UniqueName: \"kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr\") pod 
\"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630820 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25eecd8f-8b17-4e74-b651-78948c627127" path="/var/lib/kubelet/pods/25eecd8f-8b17-4e74-b651-78948c627127/volumes" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630844 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle\") pod \"c180eac0-e93f-4067-ba6d-32a023f424e6\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630866 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630893 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630914 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs\") pod \"c180eac0-e93f-4067-ba6d-32a023f424e6\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.630981 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631197 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ggwb\" (UniqueName: \"kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb\") pod \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631267 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8pnj\" (UniqueName: \"kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631318 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data\") pod \"c180eac0-e93f-4067-ba6d-32a023f424e6\" (UID: \"c180eac0-e93f-4067-ba6d-32a023f424e6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631342 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs\") pod \"600c553e-f8e5-4ec6-94e7-2981abc748cb\" (UID: \"600c553e-f8e5-4ec6-94e7-2981abc748cb\") " Dec 09 11:57:51 crc kubenswrapper[4745]: 
I1209 11:57:51.631364 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data\") pod \"c9e009b8-8eec-4028-ade9-84bc49d236c8\" (UID: \"c9e009b8-8eec-4028-ade9-84bc49d236c8\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631384 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631405 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs\") pod \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\" (UID: \"73ed8df4-e28f-4c76-bca1-a3d77ef789d4\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631424 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom\") pod \"48a868ce-c1ab-457a-bd7f-224f8e982a13\" (UID: \"48a868ce-c1ab-457a-bd7f-224f8e982a13\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.631467 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom\") pod \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\" (UID: \"4658eac8-46b0-448b-8bc7-7c783fcef1c6\") " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.633015 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c95beb-2818-4a46-813c-b53fedce2a59" path="/var/lib/kubelet/pods/29c95beb-2818-4a46-813c-b53fedce2a59/volumes" Dec 09 11:57:51 crc 
kubenswrapper[4745]: E1209 11:57:51.633310 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde\": container with ID starting with 7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde not found: ID does not exist" containerID="7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.633380 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde"} err="failed to get container status \"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde\": rpc error: code = NotFound desc = could not find container \"7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde\": container with ID starting with 7fbc4fcdff7364e87c5838ace4d2c29c49b120562cd1441f9f1df819403a2bde not found: ID does not exist" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.633421 4745 scope.go:117] "RemoveContainer" containerID="c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.633623 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311eddda-625a-4029-ba56-b408b2242eb5" path="/var/lib/kubelet/pods/311eddda-625a-4029-ba56-b408b2242eb5/volumes" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.634180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.634349 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qqq2v\" (UniqueName: \"kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.634470 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.635368 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e7dac8-1382-446b-88c7-47104b5a89cf" path="/var/lib/kubelet/pods/56e7dac8-1382-446b-88c7-47104b5a89cf/volumes" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.636129 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9775e8b1-d49b-42eb-9941-5de54e89f465" path="/var/lib/kubelet/pods/9775e8b1-d49b-42eb-9941-5de54e89f465/volumes" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.637063 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1b624b-3936-4cec-a1ba-b4efa1504020" path="/var/lib/kubelet/pods/ca1b624b-3936-4cec-a1ba-b4efa1504020/volumes" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.641183 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a\": container with ID starting with c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a not found: ID does not exist" containerID="c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.641219 4745 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a"} err="failed to get container status \"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a\": rpc error: code = NotFound desc = could not find container \"c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a\": container with ID starting with c1072a6b7c6b1a8d03366bc7d0ddbbd40587331a7fd99d03c872e65df3f9946a not found: ID does not exist" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.641249 4745 scope.go:117] "RemoveContainer" containerID="6d7b68aa247edb856bd53e3fca4235d3708496404ba738f128d0babd51b390df" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.656756 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs" (OuterVolumeSpecName: "logs") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.657074 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll" (OuterVolumeSpecName: "kube-api-access-dszll") pod "c180eac0-e93f-4067-ba6d-32a023f424e6" (UID: "c180eac0-e93f-4067-ba6d-32a023f424e6"). InnerVolumeSpecName "kube-api-access-dszll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.657167 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm" (OuterVolumeSpecName: "kube-api-access-52wnm") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "kube-api-access-52wnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.660906 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs" (OuterVolumeSpecName: "logs") pod "c180eac0-e93f-4067-ba6d-32a023f424e6" (UID: "c180eac0-e93f-4067-ba6d-32a023f424e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.662832 4745 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.662889 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:52.662867352 +0000 UTC m=+1559.488068876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : configmap "openstack-scripts" not found Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.673741 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs" (OuterVolumeSpecName: "logs") pod "4658eac8-46b0-448b-8bc7-7c783fcef1c6" (UID: "4658eac8-46b0-448b-8bc7-7c783fcef1c6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.675606 4745 projected.go:194] Error preparing data for projected volume kube-api-access-qqq2v for pod openstack/keystone5bf8-account-delete-c4n86: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.675674 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:52.675653807 +0000 UTC m=+1559.500855331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qqq2v" (UniqueName: "kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.685737 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4658eac8-46b0-448b-8bc7-7c783fcef1c6" (UID: "4658eac8-46b0-448b-8bc7-7c783fcef1c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.686236 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts" (OuterVolumeSpecName: "scripts") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.686721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.689111 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs" (OuterVolumeSpecName: "logs") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.689915 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs" (OuterVolumeSpecName: "logs") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.691284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs" (OuterVolumeSpecName: "logs") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.705321 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.705335 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.705521 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj" (OuterVolumeSpecName: "kube-api-access-z8pnj") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "kube-api-access-z8pnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.707452 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.708840 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 11:57:51 crc kubenswrapper[4745]: E1209 11:57:51.708993 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5d388089-75a9-4e64-8fcf-575fde454708" containerName="nova-cell0-conductor-conductor" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.710020 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb" (OuterVolumeSpecName: "kube-api-access-2ggwb") pod "4658eac8-46b0-448b-8bc7-7c783fcef1c6" (UID: "4658eac8-46b0-448b-8bc7-7c783fcef1c6"). InnerVolumeSpecName "kube-api-access-2ggwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.713435 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr" (OuterVolumeSpecName: "kube-api-access-cxrbr") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "kube-api-access-cxrbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.716219 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.716721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.717111 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp" (OuterVolumeSpecName: "kube-api-access-g76qp") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "kube-api-access-g76qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.755763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts" (OuterVolumeSpecName: "scripts") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.766695 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrbr\" (UniqueName: \"kubernetes.io/projected/48a868ce-c1ab-457a-bd7f-224f8e982a13-kube-api-access-cxrbr\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.766743 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.766760 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771156 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ggwb\" (UniqueName: \"kubernetes.io/projected/4658eac8-46b0-448b-8bc7-7c783fcef1c6-kube-api-access-2ggwb\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771217 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8pnj\" (UniqueName: \"kubernetes.io/projected/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-kube-api-access-z8pnj\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771229 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771274 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771286 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dszll\" (UniqueName: \"kubernetes.io/projected/c180eac0-e93f-4067-ba6d-32a023f424e6-kube-api-access-dszll\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771304 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52wnm\" (UniqueName: \"kubernetes.io/projected/c9e009b8-8eec-4028-ade9-84bc49d236c8-kube-api-access-52wnm\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771320 4745 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48a868ce-c1ab-457a-bd7f-224f8e982a13-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771353 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771364 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/600c553e-f8e5-4ec6-94e7-2981abc748cb-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 
crc kubenswrapper[4745]: I1209 11:57:51.771373 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e009b8-8eec-4028-ade9-84bc49d236c8-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771436 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771448 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g76qp\" (UniqueName: \"kubernetes.io/projected/600c553e-f8e5-4ec6-94e7-2981abc748cb-kube-api-access-g76qp\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771456 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771471 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a868ce-c1ab-457a-bd7f-224f8e982a13-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771514 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180eac0-e93f-4067-ba6d-32a023f424e6-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.771528 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4658eac8-46b0-448b-8bc7-7c783fcef1c6-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.829361 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="94a3f188-f451-4895-b500-52a9f7877d00" 
containerName="galera" containerID="cri-o://38929371712d3d60dc685cf4a33c96454c4b248c83eff828bff29196ac2964c6" gracePeriod=30 Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.875270 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.892108 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.939898 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data" (OuterVolumeSpecName: "config-data") pod "c180eac0-e93f-4067-ba6d-32a023f424e6" (UID: "c180eac0-e93f-4067-ba6d-32a023f424e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.958378 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.976657 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.976690 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.976701 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.976712 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:51 crc kubenswrapper[4745]: I1209 11:57:51.985680 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c180eac0-e93f-4067-ba6d-32a023f424e6" (UID: "c180eac0-e93f-4067-ba6d-32a023f424e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.000126 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4658eac8-46b0-448b-8bc7-7c783fcef1c6" (UID: "4658eac8-46b0-448b-8bc7-7c783fcef1c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.011007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.027048 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.029722 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c180eac0-e93f-4067-ba6d-32a023f424e6" (UID: "c180eac0-e93f-4067-ba6d-32a023f424e6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.035403 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data" (OuterVolumeSpecName: "config-data") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.047369 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.048475 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.049177 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.054989 4745 scope.go:117] "RemoveContainer" containerID="0d37cddfc8eb090792805f604b6641cd1c3d6607edd7eead42610629d259af0b" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.060752 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.083371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g74lk\" (UniqueName: \"kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk\") pod \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.083620 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts\") pod \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\" (UID: \"f8d65ea6-5ea0-44fa-a4ab-82297d975a87\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084324 4745 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084372 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084394 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084413 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084432 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084449 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.084466 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180eac0-e93f-4067-ba6d-32a023f424e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.085503 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "f8d65ea6-5ea0-44fa-a4ab-82297d975a87" (UID: "f8d65ea6-5ea0-44fa-a4ab-82297d975a87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.089337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data" (OuterVolumeSpecName: "config-data") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.089466 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.096204 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.096428 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.100470 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.117306 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk" (OuterVolumeSpecName: "kube-api-access-g74lk") pod "f8d65ea6-5ea0-44fa-a4ab-82297d975a87" (UID: "f8d65ea6-5ea0-44fa-a4ab-82297d975a87"). InnerVolumeSpecName "kube-api-access-g74lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.139747 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data" (OuterVolumeSpecName: "config-data") pod "4658eac8-46b0-448b-8bc7-7c783fcef1c6" (UID: "4658eac8-46b0-448b-8bc7-7c783fcef1c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186312 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kvzz\" (UniqueName: \"kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186360 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186453 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scr5f\" (UniqueName: \"kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f\") pod \"da94483d-f361-42ef-95b4-d4b2c79b4d80\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186486 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data\") pod \"0f786d16-8a6e-420b-b2b7-f785386e2191\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186547 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts\") pod \"058e6f79-b92b-47d3-97ae-d588fec5efcd\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186596 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs\") pod \"da94483d-f361-42ef-95b4-d4b2c79b4d80\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186616 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom\") pod \"da94483d-f361-42ef-95b4-d4b2c79b4d80\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186622 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186753 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle\") pod \"da94483d-f361-42ef-95b4-d4b2c79b4d80\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186775 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle\") pod \"0f786d16-8a6e-420b-b2b7-f785386e2191\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186800 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data\") pod \"da94483d-f361-42ef-95b4-d4b2c79b4d80\" (UID: \"da94483d-f361-42ef-95b4-d4b2c79b4d80\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxpjs\" (UniqueName: \"kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs\") pod \"058e6f79-b92b-47d3-97ae-d588fec5efcd\" (UID: \"058e6f79-b92b-47d3-97ae-d588fec5efcd\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186856 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xh4wv\" (UniqueName: \"kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv\") pod \"0f786d16-8a6e-420b-b2b7-f785386e2191\" (UID: \"0f786d16-8a6e-420b-b2b7-f785386e2191\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186886 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.186934 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187026 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts\") pod \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\" (UID: \"eaef3c48-5e7c-4ea3-a2d0-da44ea528455\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187049 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187470 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4658eac8-46b0-448b-8bc7-7c783fcef1c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187488 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187499 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187521 4745 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187532 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g74lk\" (UniqueName: \"kubernetes.io/projected/f8d65ea6-5ea0-44fa-a4ab-82297d975a87-kube-api-access-g74lk\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.187549 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.193441 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "058e6f79-b92b-47d3-97ae-d588fec5efcd" (UID: 
"058e6f79-b92b-47d3-97ae-d588fec5efcd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.195235 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs" (OuterVolumeSpecName: "logs") pod "da94483d-f361-42ef-95b4-d4b2c79b4d80" (UID: "da94483d-f361-42ef-95b4-d4b2c79b4d80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.234447 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.236561 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data" (OuterVolumeSpecName: "config-data") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.271867 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da94483d-f361-42ef-95b4-d4b2c79b4d80" (UID: "da94483d-f361-42ef-95b4-d4b2c79b4d80"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.271871 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f" (OuterVolumeSpecName: "kube-api-access-scr5f") pod "da94483d-f361-42ef-95b4-d4b2c79b4d80" (UID: "da94483d-f361-42ef-95b4-d4b2c79b4d80"). 
InnerVolumeSpecName "kube-api-access-scr5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.272062 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.272083 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73ed8df4-e28f-4c76-bca1-a3d77ef789d4" (UID: "73ed8df4-e28f-4c76-bca1-a3d77ef789d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.272125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts" (OuterVolumeSpecName: "scripts") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.272503 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs" (OuterVolumeSpecName: "kube-api-access-kxpjs") pod "058e6f79-b92b-47d3-97ae-d588fec5efcd" (UID: "058e6f79-b92b-47d3-97ae-d588fec5efcd"). InnerVolumeSpecName "kube-api-access-kxpjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.272689 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv" (OuterVolumeSpecName: "kube-api-access-xh4wv") pod "0f786d16-8a6e-420b-b2b7-f785386e2191" (UID: "0f786d16-8a6e-420b-b2b7-f785386e2191"). InnerVolumeSpecName "kube-api-access-xh4wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.273044 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz" (OuterVolumeSpecName: "kube-api-access-4kvzz") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "kube-api-access-4kvzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.289164 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts\") pod \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.289275 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6qx5\" (UniqueName: \"kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5\") pod \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\" (UID: \"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066\") " Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.289957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" (UID: "8ca899aa-fd10-43e0-ae58-a3a4ae2a4066"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290264 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290300 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kvzz\" (UniqueName: \"kubernetes.io/projected/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-kube-api-access-4kvzz\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290322 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290341 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scr5f\" (UniqueName: \"kubernetes.io/projected/da94483d-f361-42ef-95b4-d4b2c79b4d80-kube-api-access-scr5f\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290358 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058e6f79-b92b-47d3-97ae-d588fec5efcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290373 4745 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da94483d-f361-42ef-95b4-d4b2c79b4d80-logs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290387 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290400 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ed8df4-e28f-4c76-bca1-a3d77ef789d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290417 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxpjs\" (UniqueName: \"kubernetes.io/projected/058e6f79-b92b-47d3-97ae-d588fec5efcd-kube-api-access-kxpjs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290433 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh4wv\" (UniqueName: \"kubernetes.io/projected/0f786d16-8a6e-420b-b2b7-f785386e2191-kube-api-access-xh4wv\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.290450 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.329162 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5" (OuterVolumeSpecName: "kube-api-access-k6qx5") pod "8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" (UID: "8ca899aa-fd10-43e0-ae58-a3a4ae2a4066"). InnerVolumeSpecName "kube-api-access-k6qx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.334025 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f786d16-8a6e-420b-b2b7-f785386e2191","Type":"ContainerDied","Data":"0f99d043f80c96c0219013b5c63ecbffc0c921f477454b7d53e2643d3fd2ab70"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.334092 4745 scope.go:117] "RemoveContainer" containerID="709a10ef722e6bd115a3fbe87e5cae6688e5855a823df3b981535f201d001778" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.334207 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.337580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eaef3c48-5e7c-4ea3-a2d0-da44ea528455","Type":"ContainerDied","Data":"b2075f351e7ae8c27f3ebc434998594b8366224de945e6605e33ad1d8cc3d96d"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.337723 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.341823 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4957-account-delete-b6ngs" event={"ID":"1b87df09-ea26-4c97-bcd6-4ee7c6250d00","Type":"ContainerDied","Data":"dce8538aa9459efaa7837ae30be0b9745ee689a38c57c84e12eb58700786d6b5"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.341905 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce8538aa9459efaa7837ae30be0b9745ee689a38c57c84e12eb58700786d6b5" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.343718 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder686d-account-delete-s68kh" event={"ID":"8ca899aa-fd10-43e0-ae58-a3a4ae2a4066","Type":"ContainerDied","Data":"e1cfa3f3cd436608cae8cfea37aad0942ba4e426a339a4ea953d4667b2cbe4ac"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.343740 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1cfa3f3cd436608cae8cfea37aad0942ba4e426a339a4ea953d4667b2cbe4ac" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.343796 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder686d-account-delete-s68kh" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.367933 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57876487f8-zgj8m" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.368277 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57876487f8-zgj8m" event={"ID":"600c553e-f8e5-4ec6-94e7-2981abc748cb","Type":"ContainerDied","Data":"867f5731f60fe6f590fb1dc5ac8c8b8bca5f8c58708b87e5c7dc1d2c9b28d1d1"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.369965 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.372402 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" event={"ID":"4658eac8-46b0-448b-8bc7-7c783fcef1c6","Type":"ContainerDied","Data":"16d59b177c3820f47e3ee14727a18e7325b983db2fb60b44ac0a1caa054b9d35"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.372597 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55565d45c6-5hsz5" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.379916 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron1fc4-account-delete-rqj56" event={"ID":"f8d65ea6-5ea0-44fa-a4ab-82297d975a87","Type":"ContainerDied","Data":"7258dfd2d9d91a044e051336a0dd28b89acfb4cee3d06bfe62c49ee36a381e57"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.379999 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7258dfd2d9d91a044e051336a0dd28b89acfb4cee3d06bfe62c49ee36a381e57" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.380085 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron1fc4-account-delete-rqj56" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.383622 4745 generic.go:334] "Generic (PLEG): container finished" podID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerID="8dae8dcac1defac8c58cb335ba486b55e9b6076bf8990bbbc17743c619b726cb" exitCode=0 Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.383694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerDied","Data":"8dae8dcac1defac8c58cb335ba486b55e9b6076bf8990bbbc17743c619b726cb"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.385819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf679bcc-g65zr" event={"ID":"da94483d-f361-42ef-95b4-d4b2c79b4d80","Type":"ContainerDied","Data":"c018599f5942c8fc41469f65fb994bad0f7170c4d04647d6ff4ec90a9a80d783"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.385903 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cf679bcc-g65zr" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.388674 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement980e-account-delete-85kkl" event={"ID":"058e6f79-b92b-47d3-97ae-d588fec5efcd","Type":"ContainerDied","Data":"7ac766ca90f7a3dc12679f41b40e852c40e34f24b5ca71f1800c4c3e99d9cd84"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.388704 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac766ca90f7a3dc12679f41b40e852c40e34f24b5ca71f1800c4c3e99d9cd84" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.388757 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement980e-account-delete-85kkl" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.393367 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.393404 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6qx5\" (UniqueName: \"kubernetes.io/projected/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066-kube-api-access-k6qx5\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.393424 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.393866 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.399633 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73ed8df4-e28f-4c76-bca1-a3d77ef789d4","Type":"ContainerDied","Data":"3c85021515548f94941fc14712c22b63b3fc3882063840e3d2b2b60e120adcac"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.399756 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.424766 4745 generic.go:334] "Generic (PLEG): container finished" podID="84ab78e7-7419-4892-92c0-085db552be56" containerID="165fc14fd97a535329fc700bc5ee13ff68c82e069d90770b464ef7ed62a40419" exitCode=0 Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.424837 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84ab78e7-7419-4892-92c0-085db552be56","Type":"ContainerDied","Data":"165fc14fd97a535329fc700bc5ee13ff68c82e069d90770b464ef7ed62a40419"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.434242 4745 generic.go:334] "Generic (PLEG): container finished" podID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerID="5fd672e48430bca4d1c2a71daa80ccf27bd2fcb9360ab8501dfd3906ec36e53e" exitCode=0 Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.434377 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.434947 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerDied","Data":"5fd672e48430bca4d1c2a71daa80ccf27bd2fcb9360ab8501dfd3906ec36e53e"} Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.435089 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.436074 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.436342 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.450556 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "600c553e-f8e5-4ec6-94e7-2981abc748cb" (UID: "600c553e-f8e5-4ec6-94e7-2981abc748cb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.495346 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.496383 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600c553e-f8e5-4ec6-94e7-2981abc748cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.503039 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e009b8-8eec-4028-ade9-84bc49d236c8" (UID: "c9e009b8-8eec-4028-ade9-84bc49d236c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.505973 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f786d16-8a6e-420b-b2b7-f785386e2191" (UID: "0f786d16-8a6e-420b-b2b7-f785386e2191"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.556069 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:57:52 crc kubenswrapper[4745]: E1209 11:57:52.558047 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.604559 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e009b8-8eec-4028-ade9-84bc49d236c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.604597 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.614761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data" (OuterVolumeSpecName: "config-data") pod "48a868ce-c1ab-457a-bd7f-224f8e982a13" (UID: "48a868ce-c1ab-457a-bd7f-224f8e982a13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.623775 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da94483d-f361-42ef-95b4-d4b2c79b4d80" (UID: "da94483d-f361-42ef-95b4-d4b2c79b4d80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.628745 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data" (OuterVolumeSpecName: "config-data") pod "0f786d16-8a6e-420b-b2b7-f785386e2191" (UID: "0f786d16-8a6e-420b-b2b7-f785386e2191"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.659651 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data" (OuterVolumeSpecName: "config-data") pod "da94483d-f361-42ef-95b4-d4b2c79b4d80" (UID: "da94483d-f361-42ef-95b4-d4b2c79b4d80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706239 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706363 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqq2v\" (UniqueName: \"kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v\") pod \"keystone5bf8-account-delete-c4n86\" (UID: \"a7755562-a5b1-4b4d-833c-5a3179a2926c\") " pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706483 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f786d16-8a6e-420b-b2b7-f785386e2191-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706497 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706547 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da94483d-f361-42ef-95b4-d4b2c79b4d80-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.706561 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a868ce-c1ab-457a-bd7f-224f8e982a13-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:52 crc kubenswrapper[4745]: E1209 11:57:52.707122 4745 
configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 11:57:52 crc kubenswrapper[4745]: E1209 11:57:52.707199 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:54.70716531 +0000 UTC m=+1561.532366834 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : configmap "openstack-scripts" not found Dec 09 11:57:52 crc kubenswrapper[4745]: E1209 11:57:52.715135 4745 projected.go:194] Error preparing data for projected volume kube-api-access-qqq2v for pod openstack/keystone5bf8-account-delete-c4n86: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:52 crc kubenswrapper[4745]: E1209 11:57:52.715577 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v podName:a7755562-a5b1-4b4d-833c-5a3179a2926c nodeName:}" failed. No retries permitted until 2025-12-09 11:57:54.715549086 +0000 UTC m=+1561.540750610 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqq2v" (UniqueName: "kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v") pod "keystone5bf8-account-delete-c4n86" (UID: "a7755562-a5b1-4b4d-833c-5a3179a2926c") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.729924 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:52 crc kubenswrapper[4745]: I1209 11:57:52.808773 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:52.845220 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data" (OuterVolumeSpecName: "config-data") pod "eaef3c48-5e7c-4ea3-a2d0-da44ea528455" (UID: "eaef3c48-5e7c-4ea3-a2d0-da44ea528455"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:52.912825 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaef3c48-5e7c-4ea3-a2d0-da44ea528455-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.051618 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4966q"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.066960 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4966q"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.081650 4745 scope.go:117] "RemoveContainer" containerID="7a0d9d93ae67ba5d6ed408c67d268af4c670890fd02fafc91373ab63fe2182ca" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.082207 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron1fc4-account-delete-rqj56"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.097567 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.109994 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1fc4-account-create-update-qwf8c"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.133476 4745 scope.go:117] "RemoveContainer" containerID="85b41109811ea6cff92fc56baafb1484f3284cd5b7dfd75355171bc8db15a025" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.145728 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.159754 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.164566 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron1fc4-account-delete-rqj56"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.168606 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.187883 4745 scope.go:117] "RemoveContainer" containerID="9b2f931241d0299ec3b05ca17d1893191def91b2abcc6ab8e43b6350e1923813" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.188098 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1fc4-account-create-update-qwf8c"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.192261 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.199369 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2kw4j"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.219326 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2kw4j"] Dec 09 11:57:53 crc kubenswrapper[4745]: E1209 11:57:53.227665 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1 is running failed: container process not found" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:53 crc kubenswrapper[4745]: E1209 11:57:53.229165 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1 is running failed: container process not found" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:53 crc kubenswrapper[4745]: E1209 11:57:53.229669 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1 is running failed: container process not found" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 11:57:53 crc kubenswrapper[4745]: E1209 11:57:53.229719 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.250357 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255737 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle\") pod \"a506f944-5b99-48af-a714-e24782ba1c06\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255790 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts\") pod \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255848 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp\") pod \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\" (UID: \"1b87df09-ea26-4c97-bcd6-4ee7c6250d00\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255892 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488h7\" (UniqueName: \"kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7\") pod \"0da87c5d-f709-4d9b-b182-421edbb61f00\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts\") pod \"0da87c5d-f709-4d9b-b182-421edbb61f00\" (UID: \"0da87c5d-f709-4d9b-b182-421edbb61f00\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.255943 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs\") pod \"a506f944-5b99-48af-a714-e24782ba1c06\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.263835 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0da87c5d-f709-4d9b-b182-421edbb61f00" (UID: "0da87c5d-f709-4d9b-b182-421edbb61f00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.266168 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfppw\" (UniqueName: \"kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw\") pod \"a506f944-5b99-48af-a714-e24782ba1c06\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.279322 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config\") pod \"a506f944-5b99-48af-a714-e24782ba1c06\" (UID: \"a506f944-5b99-48af-a714-e24782ba1c06\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.280535 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da87c5d-f709-4d9b-b182-421edbb61f00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.269354 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.281744 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement980e-account-delete-85kkl"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.291596 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-980e-account-create-update-cllcd"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.296845 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b87df09-ea26-4c97-bcd6-4ee7c6250d00" (UID: "1b87df09-ea26-4c97-bcd6-4ee7c6250d00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.308875 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement980e-account-delete-85kkl"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.320649 4745 scope.go:117] "RemoveContainer" containerID="76e07b4d01b40ce6fe2153747a2637a847f6695c48e7781c037a25626d453609" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.328092 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp" (OuterVolumeSpecName: "kube-api-access-m49pp") pod "1b87df09-ea26-4c97-bcd6-4ee7c6250d00" (UID: "1b87df09-ea26-4c97-bcd6-4ee7c6250d00"). InnerVolumeSpecName "kube-api-access-m49pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.328594 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw" (OuterVolumeSpecName: "kube-api-access-zfppw") pod "a506f944-5b99-48af-a714-e24782ba1c06" (UID: "a506f944-5b99-48af-a714-e24782ba1c06"). InnerVolumeSpecName "kube-api-access-zfppw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.336086 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7" (OuterVolumeSpecName: "kube-api-access-488h7") pod "0da87c5d-f709-4d9b-b182-421edbb61f00" (UID: "0da87c5d-f709-4d9b-b182-421edbb61f00"). InnerVolumeSpecName "kube-api-access-488h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.340068 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-980e-account-create-update-cllcd"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.357041 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a506f944-5b99-48af-a714-e24782ba1c06" (UID: "a506f944-5b99-48af-a714-e24782ba1c06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.360567 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.394243 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bwb6\" (UniqueName: \"kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6\") pod \"88802adc-d164-420b-98d2-a757b6627350\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.394494 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6q64\" (UniqueName: \"kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64\") pod \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.394782 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts\") pod \"88802adc-d164-420b-98d2-a757b6627350\" (UID: \"88802adc-d164-420b-98d2-a757b6627350\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.394899 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts\") pod \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\" (UID: \"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.397704 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" (UID: 
"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.398145 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88802adc-d164-420b-98d2-a757b6627350" (UID: "88802adc-d164-420b-98d2-a757b6627350"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.403483 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "a506f944-5b99-48af-a714-e24782ba1c06" (UID: "a506f944-5b99-48af-a714-e24782ba1c06"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.403618 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6" (OuterVolumeSpecName: "kube-api-access-7bwb6") pod "88802adc-d164-420b-98d2-a757b6627350" (UID: "88802adc-d164-420b-98d2-a757b6627350"). InnerVolumeSpecName "kube-api-access-7bwb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.423829 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.425336 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfppw\" (UniqueName: \"kubernetes.io/projected/a506f944-5b99-48af-a714-e24782ba1c06-kube-api-access-zfppw\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.425434 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.425599 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.425619 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/1b87df09-ea26-4c97-bcd6-4ee7c6250d00-kube-api-access-m49pp\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.425639 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488h7\" (UniqueName: \"kubernetes.io/projected/0da87c5d-f709-4d9b-b182-421edbb61f00-kube-api-access-488h7\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.426312 4745 scope.go:117] "RemoveContainer" containerID="84c6b918c9652ae6342e8c33b38c40a7337bb5caf0afd25ca43e70b2c29d044c" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.440308 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "a506f944-5b99-48af-a714-e24782ba1c06" (UID: "a506f944-5b99-48af-a714-e24782ba1c06"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.462586 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64" (OuterVolumeSpecName: "kube-api-access-c6q64") pod "e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" (UID: "e822e9c0-d6fa-4880-a0e3-8dfb32405a6f"). InnerVolumeSpecName "kube-api-access-c6q64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.466889 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.496177 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ceba626e-26d1-495f-b88d-fed69e445ddb","Type":"ContainerDied","Data":"e2bad3265f0201a0d7ec8975a55822f12482128226d12f4802d071a66515b01e"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.496390 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.507443 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xl2mt"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.512550 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0e536-account-delete-zsxfl" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.513962 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e536-account-delete-zsxfl" event={"ID":"e822e9c0-d6fa-4880-a0e3-8dfb32405a6f","Type":"ContainerDied","Data":"0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.514024 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0565c02dccb4ed507911cb3e1400e46707346fe6909e6a5ffcde4b1ce1ff7197" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.528782 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle\") pod \"392b878a-37ba-4887-a699-672c8b92e947\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.528935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm98c\" (UniqueName: \"kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c\") pod \"392b878a-37ba-4887-a699-672c8b92e947\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.528965 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data\") pod \"392b878a-37ba-4887-a699-672c8b92e947\" (UID: \"392b878a-37ba-4887-a699-672c8b92e947\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.529355 4745 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc 
kubenswrapper[4745]: I1209 11:57:53.529375 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6q64\" (UniqueName: \"kubernetes.io/projected/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-kube-api-access-c6q64\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.529385 4745 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506f944-5b99-48af-a714-e24782ba1c06-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.529398 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88802adc-d164-420b-98d2-a757b6627350-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.529410 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.529424 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bwb6\" (UniqueName: \"kubernetes.io/projected/88802adc-d164-420b-98d2-a757b6627350-kube-api-access-7bwb6\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.535210 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c" (OuterVolumeSpecName: "kube-api-access-mm98c") pod "392b878a-37ba-4887-a699-672c8b92e947" (UID: "392b878a-37ba-4887-a699-672c8b92e947"). InnerVolumeSpecName "kube-api-access-mm98c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.535277 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xl2mt"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.540816 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.565919 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.591280 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.591643 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi03c2-account-delete-4b54q" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.592403 4745 scope.go:117] "RemoveContainer" containerID="8c3a126676cf77a7477f9d5072236397973a255f712333be799d5fe817503fba" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.597221 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392b878a-37ba-4887-a699-672c8b92e947" (UID: "392b878a-37ba-4887-a699-672c8b92e947"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.608081 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data" (OuterVolumeSpecName: "config-data") pod "392b878a-37ba-4887-a699-672c8b92e947" (UID: "392b878a-37ba-4887-a699-672c8b92e947"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.619089 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.622047 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d4e762-f4d9-4409-9051-d396157c0e90" path="/var/lib/kubelet/pods/00d4e762-f4d9-4409-9051-d396157c0e90/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.622919 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0452bd55-b0a3-46a9-a388-6db2e40f4cb7" path="/var/lib/kubelet/pods/0452bd55-b0a3-46a9-a388-6db2e40f4cb7/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.624726 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058e6f79-b92b-47d3-97ae-d588fec5efcd" path="/var/lib/kubelet/pods/058e6f79-b92b-47d3-97ae-d588fec5efcd/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.625604 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2319532e-1ba5-4616-8363-16f109438bd8" path="/var/lib/kubelet/pods/2319532e-1ba5-4616-8363-16f109438bd8/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.627278 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5257a699-8e01-4460-9764-ae38c984495e" path="/var/lib/kubelet/pods/5257a699-8e01-4460-9764-ae38c984495e/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.629218 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" path="/var/lib/kubelet/pods/c9e009b8-8eec-4028-ade9-84bc49d236c8/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630065 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2927f7-0512-4059-8f10-487ac6fbbad0" path="/var/lib/kubelet/pods/ca2927f7-0512-4059-8f10-487ac6fbbad0/volumes" Dec 09 11:57:53 crc 
kubenswrapper[4745]: I1209 11:57:53.630469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630580 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kts6\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630658 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630696 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630715 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0683710-3d66-4128-981a-227590aa97a0" path="/var/lib/kubelet/pods/d0683710-3d66-4128-981a-227590aa97a0/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.630720 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 
crc kubenswrapper[4745]: I1209 11:57:53.631651 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.631730 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.631759 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.631874 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.632022 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.632113 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info\") pod \"ceba626e-26d1-495f-b88d-fed69e445ddb\" (UID: \"ceba626e-26d1-495f-b88d-fed69e445ddb\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.632132 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d65ea6-5ea0-44fa-a4ab-82297d975a87" path="/var/lib/kubelet/pods/f8d65ea6-5ea0-44fa-a4ab-82297d975a87/volumes" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.634107 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.634200 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm98c\" (UniqueName: \"kubernetes.io/projected/392b878a-37ba-4887-a699-672c8b92e947-kube-api-access-mm98c\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.634256 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392b878a-37ba-4887-a699-672c8b92e947-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.637290 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.640571 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.646767 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.651105 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info" (OuterVolumeSpecName: "pod-info") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.651899 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.657772 4745 scope.go:117] "RemoveContainer" containerID="09651a3667881d281a4da0af523c62fd762af27f395c545a4af9df337fa048bd" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658373 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392b878a-37ba-4887-a699-672c8b92e947","Type":"ContainerDied","Data":"7adfd565aa074e4b03e26470a07850ac0acc8fcf3b826ab83943ab8c20b30f75"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658418 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi03c2-account-delete-4b54q" event={"ID":"0da87c5d-f709-4d9b-b182-421edbb61f00","Type":"ContainerDied","Data":"c5fb98f1a64a8c7231b9ecbecc3a768da83bc7e9f003e31f3af3d1abc4bc52b1"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658439 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5fb98f1a64a8c7231b9ecbecc3a768da83bc7e9f003e31f3af3d1abc4bc52b1" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658451 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-686d-account-create-update-lkjxb"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658468 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder686d-account-delete-s68kh"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658485 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-686d-account-create-update-lkjxb"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658523 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84ab78e7-7419-4892-92c0-085db552be56","Type":"ContainerDied","Data":"7040f68d397bde83ed93252b05cd7771f9c0d79e8883478a2cf4d910f8f5a280"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658540 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"a506f944-5b99-48af-a714-e24782ba1c06","Type":"ContainerDied","Data":"75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658556 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder686d-account-delete-s68kh"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658569 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.658884 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6" (OuterVolumeSpecName: "kube-api-access-8kts6") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "kube-api-access-8kts6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.661812 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.661863 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.667033 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican83f7-account-delete-5vbd2" event={"ID":"88802adc-d164-420b-98d2-a757b6627350","Type":"ContainerDied","Data":"3988406d5ae9fe980ddde74bc971cdf6095b3566ba793301f7161d913834d4be"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.667097 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3988406d5ae9fe980ddde74bc971cdf6095b3566ba793301f7161d913834d4be" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.667178 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican83f7-account-delete-5vbd2" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.680270 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7cf679bcc-g65zr"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.701893 4745 generic.go:334] "Generic (PLEG): container finished" podID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerID="31bf789bc4e44645e0b834e648dabfc8f57c6ad93e2976fce50cb6120b8850cf" exitCode=0 Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.702039 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone5bf8-account-delete-c4n86" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.702103 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b860955-30eb-40e6-bd56-caf6098aed8a","Type":"ContainerDied","Data":"31bf789bc4e44645e0b834e648dabfc8f57c6ad93e2976fce50cb6120b8850cf"} Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.720618 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.720719 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4957-account-delete-b6ngs" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.731438 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737013 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grjnl\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737127 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737184 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 
09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737230 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737303 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data\") pod \"84ab78e7-7419-4892-92c0-085db552be56\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn9w4\" (UniqueName: \"kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4\") pod \"84ab78e7-7419-4892-92c0-085db552be56\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737370 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs\") pod \"84ab78e7-7419-4892-92c0-085db552be56\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737464 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle\") pod \"84ab78e7-7419-4892-92c0-085db552be56\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737489 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737531 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config\") pod \"84ab78e7-7419-4892-92c0-085db552be56\" (UID: \"84ab78e7-7419-4892-92c0-085db552be56\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737550 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737613 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737647 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737685 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737704 
4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.737736 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins\") pod \"5b860955-30eb-40e6-bd56-caf6098aed8a\" (UID: \"5b860955-30eb-40e6-bd56-caf6098aed8a\") " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738138 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738155 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kts6\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-kube-api-access-8kts6\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738168 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738178 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738188 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc 
kubenswrapper[4745]: I1209 11:57:53.738199 4745 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ceba626e-26d1-495f-b88d-fed69e445ddb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738209 4745 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.738219 4745 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ceba626e-26d1-495f-b88d-fed69e445ddb-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.741909 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.741933 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.742263 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.744061 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data" (OuterVolumeSpecName: "config-data") pod "84ab78e7-7419-4892-92c0-085db552be56" (UID: "84ab78e7-7419-4892-92c0-085db552be56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.745834 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data" (OuterVolumeSpecName: "config-data") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.745927 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.746865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "84ab78e7-7419-4892-92c0-085db552be56" (UID: "84ab78e7-7419-4892-92c0-085db552be56"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.749778 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf" (OuterVolumeSpecName: "server-conf") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: "ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.761737 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4" (OuterVolumeSpecName: "kube-api-access-xn9w4") pod "84ab78e7-7419-4892-92c0-085db552be56" (UID: "84ab78e7-7419-4892-92c0-085db552be56"). InnerVolumeSpecName "kube-api-access-xn9w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.761952 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.763897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.764009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl" (OuterVolumeSpecName: "kube-api-access-grjnl") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "kube-api-access-grjnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.770232 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data" (OuterVolumeSpecName: "config-data") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.778432 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.779836 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.788223 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84ab78e7-7419-4892-92c0-085db552be56" (UID: "84ab78e7-7419-4892-92c0-085db552be56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.792833 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info" (OuterVolumeSpecName: "pod-info") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.793716 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.818741 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-55565d45c6-5hsz5"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841569 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841614 4745 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b860955-30eb-40e6-bd56-caf6098aed8a-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841625 4745 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b860955-30eb-40e6-bd56-caf6098aed8a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841661 4745 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841691 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841701 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grjnl\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-kube-api-access-grjnl\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841709 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841718 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841726 4745 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841734 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841743 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-config-data\") on 
node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841769 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn9w4\" (UniqueName: \"kubernetes.io/projected/84ab78e7-7419-4892-92c0-085db552be56-kube-api-access-xn9w4\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841778 4745 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ceba626e-26d1-495f-b88d-fed69e445ddb-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841786 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841798 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.841808 4745 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84ab78e7-7419-4892-92c0-085db552be56-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.854172 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf" (OuterVolumeSpecName: "server-conf") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.858906 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "84ab78e7-7419-4892-92c0-085db552be56" (UID: "84ab78e7-7419-4892-92c0-085db552be56"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.859643 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.869288 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.882731 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.891860 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.892777 4745 scope.go:117] "RemoveContainer" containerID="247fcd5449ca5281d6d3e9c9cde028019450ce328caca6d4d39e7b414171ced2" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.897941 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.899877 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rfdgn"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.902138 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ceba626e-26d1-495f-b88d-fed69e445ddb" (UID: 
"ceba626e-26d1-495f-b88d-fed69e445ddb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.906384 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rfdgn"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.924013 4745 scope.go:117] "RemoveContainer" containerID="a9d2e5091d7aefc5b0d929913437d6ded33f50f234ba6c80915715daf68d74db" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.925261 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5b860955-30eb-40e6-bd56-caf6098aed8a" (UID: "5b860955-30eb-40e6-bd56-caf6098aed8a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.930584 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4957-account-create-update-mgqk9"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.940945 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4957-account-delete-b6ngs"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.943933 4745 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b860955-30eb-40e6-bd56-caf6098aed8a-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.943969 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.943979 4745 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84ab78e7-7419-4892-92c0-085db552be56-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.943989 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ceba626e-26d1-495f-b88d-fed69e445ddb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.943998 4745 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b860955-30eb-40e6-bd56-caf6098aed8a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.950725 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4957-account-create-update-mgqk9"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.960076 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance4957-account-delete-b6ngs"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.972200 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"] Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.975743 4745 scope.go:117] "RemoveContainer" containerID="892538c48ca05829a077bcf325c90ee4bb55781acc43871ab0ffe0358d7af1b9" Dec 09 11:57:53 crc kubenswrapper[4745]: I1209 11:57:53.978924 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57876487f8-zgj8m"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.012760 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.016032 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b9f661d-4261-4e54-883d-cb0e7479a3d2/ovn-northd/0.log" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.016474 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.031504 4745 scope.go:117] "RemoveContainer" containerID="8dae8dcac1defac8c58cb335ba486b55e9b6076bf8990bbbc17743c619b726cb" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.031869 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.066960 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.075557 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.077778 4745 scope.go:117] "RemoveContainer" containerID="4b5baf1df440151578f138c2907e5ac84a040ebc7d009362d8e110c6311812ee" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.150654 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qh9lk"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151607 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151704 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151810 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rpl\" (UniqueName: 
\"kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151860 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151898 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.151931 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.152008 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config\") pod \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\" (UID: \"8b9f661d-4261-4e54-883d-cb0e7479a3d2\") " Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.153023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config" (OuterVolumeSpecName: "config") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.154915 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.170635 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qh9lk"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.176847 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts" (OuterVolumeSpecName: "scripts") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.179404 4745 scope.go:117] "RemoveContainer" containerID="4362cdcbe28cc87f9a30f3b6d968ff0b65e8991acf95fb8eeec6d303cfb21ea1" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.181019 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-83f7-account-create-update-t6hf9"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.187400 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.189006 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl" (OuterVolumeSpecName: "kube-api-access-p2rpl") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "kube-api-access-p2rpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.191198 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican83f7-account-delete-5vbd2"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.207650 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-83f7-account-create-update-t6hf9"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.226631 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican83f7-account-delete-5vbd2"] Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.253678 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.253723 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rpl\" (UniqueName: \"kubernetes.io/projected/8b9f661d-4261-4e54-883d-cb0e7479a3d2-kube-api-access-p2rpl\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.253735 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.253745 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.253756 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9f661d-4261-4e54-883d-cb0e7479a3d2-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.272566 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.272642 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.294360 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.326453 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.334686 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8b9f661d-4261-4e54-883d-cb0e7479a3d2" (UID: "8b9f661d-4261-4e54-883d-cb0e7479a3d2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.334880 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.355900 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.355936 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b9f661d-4261-4e54-883d-cb0e7479a3d2-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.357350 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone5bf8-account-delete-c4n86"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.366296 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone5bf8-account-delete-c4n86"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.374109 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.381212 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.397452 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.412274 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.421285 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mndrs"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.446584 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mndrs"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.457764 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7755562-a5b1-4b4d-833c-5a3179a2926c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.457813 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqq2v\" (UniqueName: \"kubernetes.io/projected/a7755562-a5b1-4b4d-833c-5a3179a2926c-kube-api-access-qqq2v\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.459383 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-03c2-account-create-update-4546z"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.468214 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi03c2-account-delete-4b54q"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.474114 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-03c2-account-create-update-4546z"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.481131 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi03c2-account-delete-4b54q"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.487846 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.497003 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.503170 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qq6pt"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.511785 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qq6pt"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.518863 4745 scope.go:117] "RemoveContainer" containerID="165fc14fd97a535329fc700bc5ee13ff68c82e069d90770b464ef7ed62a40419"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.525696 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e536-account-create-update-bnmp6"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.535806 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e536-account-create-update-bnmp6"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.543236 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0e536-account-delete-zsxfl"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.557012 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0e536-account-delete-zsxfl"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.577102 4745 scope.go:117] "RemoveContainer" containerID="84094740573a16863643bdc2da525be2a9ddb0b73b675078b658770516cc0d5e"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.656858 4745 scope.go:117] "RemoveContainer" containerID="31bf789bc4e44645e0b834e648dabfc8f57c6ad93e2976fce50cb6120b8850cf"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.678404 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.705424 4745 scope.go:117] "RemoveContainer" containerID="3183cfb66289decc58035a93ed3c6db16f62f8769a1980e9dea7b5da9edf7ad3"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.752176 4745 generic.go:334] "Generic (PLEG): container finished" podID="ff59337d-f366-446b-9752-eb371ee468e4" containerID="e26931e675e3e053e3c67af648d2433105486188c6e62356a6e8b58996656df4" exitCode=0
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.752266 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c9f47db8-wmdgz" event={"ID":"ff59337d-f366-446b-9752-eb371ee468e4","Type":"ContainerDied","Data":"e26931e675e3e053e3c67af648d2433105486188c6e62356a6e8b58996656df4"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.761935 4745 generic.go:334] "Generic (PLEG): container finished" podID="94a3f188-f451-4895-b500-52a9f7877d00" containerID="38929371712d3d60dc685cf4a33c96454c4b248c83eff828bff29196ac2964c6" exitCode=0
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.762026 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerDied","Data":"38929371712d3d60dc685cf4a33c96454c4b248c83eff828bff29196ac2964c6"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764261 4745 generic.go:334] "Generic (PLEG): container finished" podID="5d388089-75a9-4e64-8fcf-575fde454708" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae" exitCode=0
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764331 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764341 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d388089-75a9-4e64-8fcf-575fde454708","Type":"ContainerDied","Data":"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764377 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d388089-75a9-4e64-8fcf-575fde454708","Type":"ContainerDied","Data":"35f4c5a9c3929502edb9496ad564801a9c47af1cf371c60d7f51177b9e941d10"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764402 4745 scope.go:117] "RemoveContainer" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.764923 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn8g6\" (UniqueName: \"kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6\") pod \"5d388089-75a9-4e64-8fcf-575fde454708\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.765001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") pod \"5d388089-75a9-4e64-8fcf-575fde454708\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.765042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle\") pod \"5d388089-75a9-4e64-8fcf-575fde454708\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.770811 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6" (OuterVolumeSpecName: "kube-api-access-rn8g6") pod "5d388089-75a9-4e64-8fcf-575fde454708" (UID: "5d388089-75a9-4e64-8fcf-575fde454708"). InnerVolumeSpecName "kube-api-access-rn8g6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.771353 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b9f661d-4261-4e54-883d-cb0e7479a3d2/ovn-northd/0.log"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.771396 4745 generic.go:334] "Generic (PLEG): container finished" podID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c" exitCode=139
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.771458 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerDied","Data":"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.771489 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b9f661d-4261-4e54-883d-cb0e7479a3d2","Type":"ContainerDied","Data":"367e9fb43e0608bca3ac04753c11274f0fa57e93d77bd613914c60ca5b8536b0"}
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.771567 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 09 11:57:54 crc kubenswrapper[4745]: E1209 11:57:54.789964 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data podName:5d388089-75a9-4e64-8fcf-575fde454708 nodeName:}" failed. No retries permitted until 2025-12-09 11:57:55.289928027 +0000 UTC m=+1562.115129551 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data") pod "5d388089-75a9-4e64-8fcf-575fde454708" (UID: "5d388089-75a9-4e64-8fcf-575fde454708") : error deleting /var/lib/kubelet/pods/5d388089-75a9-4e64-8fcf-575fde454708/volume-subpaths: remove /var/lib/kubelet/pods/5d388089-75a9-4e64-8fcf-575fde454708/volume-subpaths: no such file or directory
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.792493 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.795934 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d388089-75a9-4e64-8fcf-575fde454708" (UID: "5d388089-75a9-4e64-8fcf-575fde454708"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.802863 4745 scope.go:117] "RemoveContainer" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"
Dec 09 11:57:54 crc kubenswrapper[4745]: E1209 11:57:54.810802 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae\": container with ID starting with 5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae not found: ID does not exist" containerID="5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.811538 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae"} err="failed to get container status \"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae\": rpc error: code = NotFound desc = could not find container \"5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae\": container with ID starting with 5e8c79b90f4d2a11dc79650cd3c734e513694255f605827a249017e68a0d2eae not found: ID does not exist"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.811761 4745 scope.go:117] "RemoveContainer" containerID="ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.868947 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.881166 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.881405 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.881962 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882181 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882225 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882378 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882437 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882473 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt7dv\" (UniqueName: \"kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv\") pod \"94a3f188-f451-4895-b500-52a9f7877d00\" (UID: \"94a3f188-f451-4895-b500-52a9f7877d00\") "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.882476 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.883475 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn8g6\" (UniqueName: \"kubernetes.io/projected/5d388089-75a9-4e64-8fcf-575fde454708-kube-api-access-rn8g6\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.883640 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.883655 4745 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.887783 4745 scope.go:117] "RemoveContainer" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.888222 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv" (OuterVolumeSpecName: "kube-api-access-vt7dv") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "kube-api-access-vt7dv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.888536 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.888966 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.890650 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.911368 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.913500 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.916716 4745 scope.go:117] "RemoveContainer" containerID="ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"
Dec 09 11:57:54 crc kubenswrapper[4745]: E1209 11:57:54.919386 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84\": container with ID starting with ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84 not found: ID does not exist" containerID="ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.919436 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84"} err="failed to get container status \"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84\": rpc error: code = NotFound desc = could not find container \"ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84\": container with ID starting with ef71441cc3ee46bc1ee03800d7cc412e88722d8e33ae675352d43ada5e7b3e84 not found: ID does not exist"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.919465 4745 scope.go:117] "RemoveContainer" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"
Dec 09 11:57:54 crc kubenswrapper[4745]: E1209 11:57:54.919734 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c\": container with ID starting with f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c not found: ID does not exist" containerID="f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.919756 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c"} err="failed to get container status \"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c\": rpc error: code = NotFound desc = could not find container \"f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c\": container with ID starting with f58d4ebad9deed9951f91d864788f5eb08a4996955619172fb6a2a132012728c not found: ID does not exist"
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.938117 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "94a3f188-f451-4895-b500-52a9f7877d00" (UID: "94a3f188-f451-4895-b500-52a9f7877d00"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985782 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985827 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-config-data-default\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985844 4745 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985855 4745 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94a3f188-f451-4895-b500-52a9f7877d00-config-data-generated\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985865 4745 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94a3f188-f451-4895-b500-52a9f7877d00-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985877 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt7dv\" (UniqueName: \"kubernetes.io/projected/94a3f188-f451-4895-b500-52a9f7877d00-kube-api-access-vt7dv\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:54 crc kubenswrapper[4745]: I1209 11:57:54.985887 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a3f188-f451-4895-b500-52a9f7877d00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.005425 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.025493 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56c9f47db8-wmdgz"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124159 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkswp\" (UniqueName: \"kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124244 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124363 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124429 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124464 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124523 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124569 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle\") pod \"ff59337d-f366-446b-9752-eb371ee468e4\" (UID: \"ff59337d-f366-446b-9752-eb371ee468e4\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.124914 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.134123 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.140984 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp" (OuterVolumeSpecName: "kube-api-access-hkswp") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "kube-api-access-hkswp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.144854 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.149928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts" (OuterVolumeSpecName: "scripts") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.160060 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.177623 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data" (OuterVolumeSpecName: "config-data") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.187177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.194264 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff59337d-f366-446b-9752-eb371ee468e4" (UID: "ff59337d-f366-446b-9752-eb371ee468e4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227288 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227345 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkswp\" (UniqueName: \"kubernetes.io/projected/ff59337d-f366-446b-9752-eb371ee468e4-kube-api-access-hkswp\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227363 4745 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227373 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227382 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227391 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227400 4745 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.227408 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff59337d-f366-446b-9752-eb371ee468e4-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.238258 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57876487f8-zgj8m" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.241821 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57876487f8-zgj8m" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.328419 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") pod \"5d388089-75a9-4e64-8fcf-575fde454708\" (UID: \"5d388089-75a9-4e64-8fcf-575fde454708\") "
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.340553 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data" (OuterVolumeSpecName: "config-data") pod "5d388089-75a9-4e64-8fcf-575fde454708" (UID: "5d388089-75a9-4e64-8fcf-575fde454708"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.412716 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.418662 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.430661 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d388089-75a9-4e64-8fcf-575fde454708-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.566435 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da87c5d-f709-4d9b-b182-421edbb61f00" path="/var/lib/kubelet/pods/0da87c5d-f709-4d9b-b182-421edbb61f00/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.567203 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" path="/var/lib/kubelet/pods/0f786d16-8a6e-420b-b2b7-f785386e2191/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.567965 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12417d7c-b1ee-406f-b60c-415562c15782" path="/var/lib/kubelet/pods/12417d7c-b1ee-406f-b60c-415562c15782/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.568548 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b254d08-6ebd-492e-b955-afc9bed1b627" path="/var/lib/kubelet/pods/1b254d08-6ebd-492e-b955-afc9bed1b627/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.570575 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b87df09-ea26-4c97-bcd6-4ee7c6250d00" path="/var/lib/kubelet/pods/1b87df09-ea26-4c97-bcd6-4ee7c6250d00/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.571033 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2018b5a5-625c-4595-a26f-1f4a6df2bb90" path="/var/lib/kubelet/pods/2018b5a5-625c-4595-a26f-1f4a6df2bb90/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.571565 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390bc098-2623-4898-b666-0e615bfa815a" path="/var/lib/kubelet/pods/390bc098-2623-4898-b666-0e615bfa815a/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.572607 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392b878a-37ba-4887-a699-672c8b92e947" path="/var/lib/kubelet/pods/392b878a-37ba-4887-a699-672c8b92e947/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.573107 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d08272-d758-4404-aae4-a64897dfbab8" path="/var/lib/kubelet/pods/40d08272-d758-4404-aae4-a64897dfbab8/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.573720 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" path="/var/lib/kubelet/pods/4658eac8-46b0-448b-8bc7-7c783fcef1c6/volumes"
Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.575138 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13"
path="/var/lib/kubelet/pods/48a868ce-c1ab-457a-bd7f-224f8e982a13/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.576418 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" path="/var/lib/kubelet/pods/5b860955-30eb-40e6-bd56-caf6098aed8a/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.577086 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d388089-75a9-4e64-8fcf-575fde454708" path="/var/lib/kubelet/pods/5d388089-75a9-4e64-8fcf-575fde454708/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.578185 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" path="/var/lib/kubelet/pods/600c553e-f8e5-4ec6-94e7-2981abc748cb/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.578967 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707152c6-0c94-4f5b-8b1a-0ca318f9fe92" path="/var/lib/kubelet/pods/707152c6-0c94-4f5b-8b1a-0ca318f9fe92/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.580057 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" path="/var/lib/kubelet/pods/73ed8df4-e28f-4c76-bca1-a3d77ef789d4/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.580988 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753eb7ac-080d-4a10-9bba-e3ed44d80985" path="/var/lib/kubelet/pods/753eb7ac-080d-4a10-9bba-e3ed44d80985/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.581598 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ab78e7-7419-4892-92c0-085db552be56" path="/var/lib/kubelet/pods/84ab78e7-7419-4892-92c0-085db552be56/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.582063 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88802adc-d164-420b-98d2-a757b6627350" 
path="/var/lib/kubelet/pods/88802adc-d164-420b-98d2-a757b6627350/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.583106 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" path="/var/lib/kubelet/pods/8b9f661d-4261-4e54-883d-cb0e7479a3d2/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.583840 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" path="/var/lib/kubelet/pods/8ca899aa-fd10-43e0-ae58-a3a4ae2a4066/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.584300 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a358310a-9254-4ca9-865c-4af37de2791a" path="/var/lib/kubelet/pods/a358310a-9254-4ca9-865c-4af37de2791a/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.585248 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a506f944-5b99-48af-a714-e24782ba1c06" path="/var/lib/kubelet/pods/a506f944-5b99-48af-a714-e24782ba1c06/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.585624 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7755562-a5b1-4b4d-833c-5a3179a2926c" path="/var/lib/kubelet/pods/a7755562-a5b1-4b4d-833c-5a3179a2926c/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.585944 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" path="/var/lib/kubelet/pods/c180eac0-e93f-4067-ba6d-32a023f424e6/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.587645 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" path="/var/lib/kubelet/pods/ceba626e-26d1-495f-b88d-fed69e445ddb/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.588697 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" 
path="/var/lib/kubelet/pods/da94483d-f361-42ef-95b4-d4b2c79b4d80/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.589435 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" path="/var/lib/kubelet/pods/e822e9c0-d6fa-4880-a0e3-8dfb32405a6f/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.590827 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" path="/var/lib/kubelet/pods/eaef3c48-5e7c-4ea3-a2d0-da44ea528455/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.591982 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb863088-6993-45cc-8d6a-bd1a7d6f403a" path="/var/lib/kubelet/pods/eb863088-6993-45cc-8d6a-bd1a7d6f403a/volumes" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.802699 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94a3f188-f451-4895-b500-52a9f7877d00","Type":"ContainerDied","Data":"24972443a6308fdfde1ad75c5910db1a23c0646187375bf6dc56b9be1cd702f9"} Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.802760 4745 scope.go:117] "RemoveContainer" containerID="38929371712d3d60dc685cf4a33c96454c4b248c83eff828bff29196ac2964c6" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.802912 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.810545 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c9f47db8-wmdgz" event={"ID":"ff59337d-f366-446b-9752-eb371ee468e4","Type":"ContainerDied","Data":"6ae9c39299fade370baa6f26261ff6c95df9c9b15cbf11b461365f6b474c6e83"} Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.810496 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c9f47db8-wmdgz" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.842902 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.845996 4745 scope.go:117] "RemoveContainer" containerID="c14f0fa5e7076677ce1b3c78038a8595a6de35d44d5b0b1d076e1b999699eed6" Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.851837 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.863598 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.879168 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-56c9f47db8-wmdgz"] Dec 09 11:57:55 crc kubenswrapper[4745]: I1209 11:57:55.903846 4745 scope.go:117] "RemoveContainer" containerID="e26931e675e3e053e3c67af648d2433105486188c6e62356a6e8b58996656df4" Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.290587 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.291318 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.291741 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.291789 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.292268 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.294501 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.297390 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:57:56 crc kubenswrapper[4745]: E1209 11:57:56.297433 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:57:56 crc kubenswrapper[4745]: I1209 11:57:56.832433 4745 generic.go:334] "Generic (PLEG): container finished" podID="f069021c-4758-4a29-98a5-2952a693cef9" containerID="366b1b96d54298d8ed7a757e883bb67d2c254c7379c342add2a12b68039fea8b" exitCode=0 Dec 09 11:57:56 crc kubenswrapper[4745]: I1209 11:57:56.832516 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerDied","Data":"366b1b96d54298d8ed7a757e883bb67d2c254c7379c342add2a12b68039fea8b"} Dec 09 11:57:56 crc kubenswrapper[4745]: I1209 11:57:56.835975 4745 generic.go:334] "Generic (PLEG): container finished" podID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerID="c29448a85553503fc1fcb48e56752ab6bca8a6dd9078d1c9f160ca8db10b194d" exitCode=0 Dec 09 11:57:56 crc kubenswrapper[4745]: I1209 11:57:56.836064 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerDied","Data":"c29448a85553503fc1fcb48e56752ab6bca8a6dd9078d1c9f160ca8db10b194d"} Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.016107 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.153689 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.161885 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr99t\" (UniqueName: \"kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.161962 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.162013 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.162045 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.162076 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.162123 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.162255 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config\") pod \"f069021c-4758-4a29-98a5-2952a693cef9\" (UID: \"f069021c-4758-4a29-98a5-2952a693cef9\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.168559 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.171118 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t" (OuterVolumeSpecName: "kube-api-access-jr99t") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "kube-api-access-jr99t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.219674 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.226279 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.237500 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config" (OuterVolumeSpecName: "config") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.243533 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.244846 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f069021c-4758-4a29-98a5-2952a693cef9" (UID: "f069021c-4758-4a29-98a5-2952a693cef9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263372 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263460 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263490 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263569 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263657 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263701 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzngt\" (UniqueName: 
\"kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263735 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263793 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts\") pod \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\" (UID: \"41d93c20-efbc-41c3-bea1-7e7dad1ae70d\") " Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.263987 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264085 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr99t\" (UniqueName: \"kubernetes.io/projected/f069021c-4758-4a29-98a5-2952a693cef9-kube-api-access-jr99t\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264106 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264118 4745 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264126 4745 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264136 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264146 4745 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.264157 4745 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f069021c-4758-4a29-98a5-2952a693cef9-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.267546 4745 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.272194 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt" (OuterVolumeSpecName: "kube-api-access-tzngt") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "kube-api-access-tzngt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.272990 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts" (OuterVolumeSpecName: "scripts") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.286881 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.317366 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.324860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365364 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365410 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzngt\" (UniqueName: \"kubernetes.io/projected/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-kube-api-access-tzngt\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365421 4745 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365431 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365440 4745 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365448 4745 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.365458 4745 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.372691 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data" (OuterVolumeSpecName: "config-data") pod "41d93c20-efbc-41c3-bea1-7e7dad1ae70d" (UID: "41d93c20-efbc-41c3-bea1-7e7dad1ae70d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.467035 4745 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d93c20-efbc-41c3-bea1-7e7dad1ae70d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.570063 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a3f188-f451-4895-b500-52a9f7877d00" path="/var/lib/kubelet/pods/94a3f188-f451-4895-b500-52a9f7877d00/volumes" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.570820 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff59337d-f366-446b-9752-eb371ee468e4" path="/var/lib/kubelet/pods/ff59337d-f366-446b-9752-eb371ee468e4/volumes" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.852991 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b68fbfd5-bx5sx" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.853005 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b68fbfd5-bx5sx" event={"ID":"f069021c-4758-4a29-98a5-2952a693cef9","Type":"ContainerDied","Data":"90bead3ede055afdc07bafce2c445e1e313577dd1e987bb5e0e9306ae04c1227"} Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.853086 4745 scope.go:117] "RemoveContainer" containerID="3c97425c86c2cd129c5f56303b9b231be4792489fd433e4c00bc829ef51d297c" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.857429 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41d93c20-efbc-41c3-bea1-7e7dad1ae70d","Type":"ContainerDied","Data":"58538038be508da88785afc2f41901e85da407d5338aff2923f4c3727138ffb2"} Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.857691 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.885611 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.886765 4745 scope.go:117] "RemoveContainer" containerID="366b1b96d54298d8ed7a757e883bb67d2c254c7379c342add2a12b68039fea8b" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.904915 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.910715 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.919346 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b68fbfd5-bx5sx"] Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.929747 4745 scope.go:117] "RemoveContainer" containerID="29da6f720b0b04eb76a45efbd214f1c1f0273bc06fc58a52d1909cd993ab7077" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.957038 4745 scope.go:117] "RemoveContainer" containerID="2e2d1a1072494cde394a0d7349ffb4decaf89624f4008fb775c18a80d492e831" Dec 09 11:57:57 crc kubenswrapper[4745]: I1209 11:57:57.978695 4745 scope.go:117] "RemoveContainer" containerID="c29448a85553503fc1fcb48e56752ab6bca8a6dd9078d1c9f160ca8db10b194d" Dec 09 11:57:58 crc kubenswrapper[4745]: I1209 11:57:58.002945 4745 scope.go:117] "RemoveContainer" containerID="5fd672e48430bca4d1c2a71daa80ccf27bd2fcb9360ab8501dfd3906ec36e53e" Dec 09 11:57:59 crc kubenswrapper[4745]: I1209 11:57:59.565015 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" path="/var/lib/kubelet/pods/41d93c20-efbc-41c3-bea1-7e7dad1ae70d/volumes" Dec 09 11:57:59 crc kubenswrapper[4745]: I1209 11:57:59.566318 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f069021c-4758-4a29-98a5-2952a693cef9" 
path="/var/lib/kubelet/pods/f069021c-4758-4a29-98a5-2952a693cef9/volumes" Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.292240 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.292635 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.292980 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.293245 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.293283 4745 prober.go:104] "Probe 
errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.294575 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.301673 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.301757 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:58:01 crc kubenswrapper[4745]: E1209 11:58:01.451294 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:03 crc kubenswrapper[4745]: I1209 11:58:03.559440 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:58:03 crc kubenswrapper[4745]: E1209 11:58:03.560623 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.291070 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.293215 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.294368 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.295184 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.295312 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.295864 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.298544 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 
11:58:06 crc kubenswrapper[4745]: E1209 11:58:06.298771 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.291151 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.292230 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.292388 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.296172 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.296256 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.299432 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.303036 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.303095 4745 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zdn2w" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 11:58:11 crc kubenswrapper[4745]: E1209 11:58:11.649085 4745 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.033082 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zdn2w_c38b2a61-5161-4132-be1a-65e25531e73a/ovs-vswitchd/0.log" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.035172 4745 generic.go:334] "Generic (PLEG): container finished" podID="c38b2a61-5161-4132-be1a-65e25531e73a" containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" exitCode=137 Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.035224 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerDied","Data":"d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390"} Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.315420 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zdn2w_c38b2a61-5161-4132-be1a-65e25531e73a/ovs-vswitchd/0.log" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.316856 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381352 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2qfq\" (UniqueName: \"kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381421 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381606 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381666 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381706 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log" (OuterVolumeSpecName: "var-log") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381774 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib\") pod \"c38b2a61-5161-4132-be1a-65e25531e73a\" (UID: \"c38b2a61-5161-4132-be1a-65e25531e73a\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381836 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib" (OuterVolumeSpecName: "var-lib") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381889 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run" (OuterVolumeSpecName: "var-run") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.381913 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.382642 4745 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-lib\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.382660 4745 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.382669 4745 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.382678 4745 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c38b2a61-5161-4132-be1a-65e25531e73a-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.383791 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts" (OuterVolumeSpecName: "scripts") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.396936 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq" (OuterVolumeSpecName: "kube-api-access-j2qfq") pod "c38b2a61-5161-4132-be1a-65e25531e73a" (UID: "c38b2a61-5161-4132-be1a-65e25531e73a"). InnerVolumeSpecName "kube-api-access-j2qfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.484653 4745 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c38b2a61-5161-4132-be1a-65e25531e73a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.484690 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2qfq\" (UniqueName: \"kubernetes.io/projected/c38b2a61-5161-4132-be1a-65e25531e73a-kube-api-access-j2qfq\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.553695 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.688799 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache\") pod \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.688910 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock\") pod \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.689031 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.689091 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8z72\" (UniqueName: 
\"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72\") pod \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.689163 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") pod \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\" (UID: \"71ebc86b-4ef3-4d3f-911f-93036c9ce19b\") " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.689797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache" (OuterVolumeSpecName: "cache") pod "71ebc86b-4ef3-4d3f-911f-93036c9ce19b" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.689824 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock" (OuterVolumeSpecName: "lock") pod "71ebc86b-4ef3-4d3f-911f-93036c9ce19b" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.690162 4745 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-cache\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.690181 4745 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-lock\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.694089 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "71ebc86b-4ef3-4d3f-911f-93036c9ce19b" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.694337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72" (OuterVolumeSpecName: "kube-api-access-v8z72") pod "71ebc86b-4ef3-4d3f-911f-93036c9ce19b" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b"). InnerVolumeSpecName "kube-api-access-v8z72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.698317 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "71ebc86b-4ef3-4d3f-911f-93036c9ce19b" (UID: "71ebc86b-4ef3-4d3f-911f-93036c9ce19b"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.792039 4745 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.792091 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8z72\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-kube-api-access-v8z72\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.792102 4745 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/71ebc86b-4ef3-4d3f-911f-93036c9ce19b-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.808033 4745 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 09 11:58:15 crc kubenswrapper[4745]: I1209 11:58:15.893668 4745 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.052008 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zdn2w_c38b2a61-5161-4132-be1a-65e25531e73a/ovs-vswitchd/0.log" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.054241 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zdn2w" event={"ID":"c38b2a61-5161-4132-be1a-65e25531e73a","Type":"ContainerDied","Data":"b2913bd328e4cb8f4404382e672f30b5f034338f9527b4a46c11c6a720e37e7e"} Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.054322 4745 scope.go:117] "RemoveContainer" 
containerID="d8dc5bc6878bfb8e48d6ebde4d740d4d52c61db5eb87bf56c3b6879ae2217390" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.054609 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zdn2w" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.076280 4745 generic.go:334] "Generic (PLEG): container finished" podID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerID="da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003" exitCode=137 Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.076378 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003"} Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.076461 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.076860 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"71ebc86b-4ef3-4d3f-911f-93036c9ce19b","Type":"ContainerDied","Data":"605f894bbcb8a27461251589d7de2e0029db9035d8ce2c6a92d5182e8747c7d0"} Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.096444 4745 scope.go:117] "RemoveContainer" containerID="b1d5dd30334a459dd66b30968e513d7ba6bb8ffdb4f65c661a254176ab116253" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.101610 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.110917 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-zdn2w"] Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.126491 4745 scope.go:117] "RemoveContainer" containerID="a71d23fc91157e8ffa3a582e3c253443fba5f09c072196181f1042c72a332661" Dec 09 11:58:16 crc 
kubenswrapper[4745]: I1209 11:58:16.131279 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.139286 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.158998 4745 scope.go:117] "RemoveContainer" containerID="da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.189692 4745 scope.go:117] "RemoveContainer" containerID="2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.220082 4745 scope.go:117] "RemoveContainer" containerID="a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.245449 4745 scope.go:117] "RemoveContainer" containerID="3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.269839 4745 scope.go:117] "RemoveContainer" containerID="efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.291327 4745 scope.go:117] "RemoveContainer" containerID="7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.309416 4745 scope.go:117] "RemoveContainer" containerID="452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.328183 4745 scope.go:117] "RemoveContainer" containerID="c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.346756 4745 scope.go:117] "RemoveContainer" containerID="affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.365942 4745 scope.go:117] "RemoveContainer" 
containerID="d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.391603 4745 scope.go:117] "RemoveContainer" containerID="159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.437010 4745 scope.go:117] "RemoveContainer" containerID="897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.460369 4745 scope.go:117] "RemoveContainer" containerID="864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.481023 4745 scope.go:117] "RemoveContainer" containerID="9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.507310 4745 scope.go:117] "RemoveContainer" containerID="949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.534696 4745 scope.go:117] "RemoveContainer" containerID="da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.535628 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003\": container with ID starting with da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003 not found: ID does not exist" containerID="da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.535695 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003"} err="failed to get container status \"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003\": rpc error: code = NotFound desc = could not 
find container \"da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003\": container with ID starting with da72e5c3801efd33947357e2fd984868e5b4f23df51fbc04f57825d0c946a003 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.535781 4745 scope.go:117] "RemoveContainer" containerID="2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.536751 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7\": container with ID starting with 2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7 not found: ID does not exist" containerID="2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.537176 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7"} err="failed to get container status \"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7\": rpc error: code = NotFound desc = could not find container \"2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7\": container with ID starting with 2bef905b24c86bdd149cc3b8f821b124b807b8515c256b0a8c7c6263b9eedcd7 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.537231 4745 scope.go:117] "RemoveContainer" containerID="a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.538628 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620\": container with ID starting with a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620 not found: ID 
does not exist" containerID="a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.538671 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620"} err="failed to get container status \"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620\": rpc error: code = NotFound desc = could not find container \"a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620\": container with ID starting with a09eef6ded4878a8e32d6ba9cbd3dd19b6b7bfeec260ad7cfb501f71b11f4620 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.538693 4745 scope.go:117] "RemoveContainer" containerID="3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.540020 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972\": container with ID starting with 3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972 not found: ID does not exist" containerID="3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.540118 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972"} err="failed to get container status \"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972\": rpc error: code = NotFound desc = could not find container \"3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972\": container with ID starting with 3b09c85bd7e17fb217f98509b4b794244fe5e3f0c26306c7c690e6a051489972 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.540170 4745 
scope.go:117] "RemoveContainer" containerID="efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.541221 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12\": container with ID starting with efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12 not found: ID does not exist" containerID="efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.541283 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12"} err="failed to get container status \"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12\": rpc error: code = NotFound desc = could not find container \"efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12\": container with ID starting with efabc77007bbbd62d845864253d28ab16751a3875bb948814d9433ce5750dc12 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.541313 4745 scope.go:117] "RemoveContainer" containerID="7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.542163 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463\": container with ID starting with 7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463 not found: ID does not exist" containerID="7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.542249 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463"} err="failed to get container status \"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463\": rpc error: code = NotFound desc = could not find container \"7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463\": container with ID starting with 7b15301db669e893c5e60abe636eaa9c44f3893adf6b79d1697d2915c154c463 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.542293 4745 scope.go:117] "RemoveContainer" containerID="452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.543018 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934\": container with ID starting with 452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934 not found: ID does not exist" containerID="452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543044 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934"} err="failed to get container status \"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934\": rpc error: code = NotFound desc = could not find container \"452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934\": container with ID starting with 452aca3b6e3787e1a6aba6c354db611a2345a39e27b0e120f6ddf7586fe81934 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543060 4745 scope.go:117] "RemoveContainer" containerID="c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.543370 4745 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b\": container with ID starting with c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b not found: ID does not exist" containerID="c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543387 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b"} err="failed to get container status \"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b\": rpc error: code = NotFound desc = could not find container \"c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b\": container with ID starting with c4b12f433424215b3190a84a7a70334fc6b3243836fe4b1b5a6c2aecc6c4f26b not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543401 4745 scope.go:117] "RemoveContainer" containerID="affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.543695 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031\": container with ID starting with affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031 not found: ID does not exist" containerID="affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543734 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031"} err="failed to get container status \"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031\": rpc error: code = NotFound desc = could not find container 
\"affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031\": container with ID starting with affba02167b7f1a39157024b3315fc6007c26612f881997d6a5c2ee7f8c3a031 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.543745 4745 scope.go:117] "RemoveContainer" containerID="d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.544194 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11\": container with ID starting with d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11 not found: ID does not exist" containerID="d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.544236 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11"} err="failed to get container status \"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11\": rpc error: code = NotFound desc = could not find container \"d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11\": container with ID starting with d1c5baae9ca212385d6390766968bb5c5c817b622698a6ebd63ce2e07c088b11 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.544267 4745 scope.go:117] "RemoveContainer" containerID="159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.544668 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6\": container with ID starting with 159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6 not found: ID does not exist" 
containerID="159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.544713 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6"} err="failed to get container status \"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6\": rpc error: code = NotFound desc = could not find container \"159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6\": container with ID starting with 159077530d8b2bbf20cfe2054dbf825f2d68d49f59c36639bd872289904ce3f6 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.544733 4745 scope.go:117] "RemoveContainer" containerID="897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.545062 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd\": container with ID starting with 897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd not found: ID does not exist" containerID="897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.545085 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd"} err="failed to get container status \"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd\": rpc error: code = NotFound desc = could not find container \"897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd\": container with ID starting with 897333cc8f92ee902c0c7c820dc78dbe5dd0956289175db6c2e6cfd0c73135bd not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.545105 4745 scope.go:117] 
"RemoveContainer" containerID="864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.545451 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb\": container with ID starting with 864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb not found: ID does not exist" containerID="864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.545611 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb"} err="failed to get container status \"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb\": rpc error: code = NotFound desc = could not find container \"864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb\": container with ID starting with 864f05e319634c7f2e50a7ad6d5981f327821fb1cfb3fd4844b58c7d6634eebb not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.545662 4745 scope.go:117] "RemoveContainer" containerID="9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.546186 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281\": container with ID starting with 9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281 not found: ID does not exist" containerID="9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.546223 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281"} err="failed to get container status \"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281\": rpc error: code = NotFound desc = could not find container \"9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281\": container with ID starting with 9331827add7fc120aedf6b88798a58ce524ffc136ae60e9e6ba61e68c91d8281 not found: ID does not exist" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.546242 4745 scope.go:117] "RemoveContainer" containerID="949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462" Dec 09 11:58:16 crc kubenswrapper[4745]: E1209 11:58:16.546503 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462\": container with ID starting with 949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462 not found: ID does not exist" containerID="949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462" Dec 09 11:58:16 crc kubenswrapper[4745]: I1209 11:58:16.546551 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462"} err="failed to get container status \"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462\": rpc error: code = NotFound desc = could not find container \"949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462\": container with ID starting with 949bb722de609413bb37a922ca3be73fb6c7a4916d0fe96c8b8b977d069ac462 not found: ID does not exist" Dec 09 11:58:17 crc kubenswrapper[4745]: I1209 11:58:17.554614 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:58:17 crc kubenswrapper[4745]: E1209 11:58:17.555005 4745 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:58:17 crc kubenswrapper[4745]: I1209 11:58:17.564965 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" path="/var/lib/kubelet/pods/71ebc86b-4ef3-4d3f-911f-93036c9ce19b/volumes" Dec 09 11:58:17 crc kubenswrapper[4745]: I1209 11:58:17.567523 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" path="/var/lib/kubelet/pods/c38b2a61-5161-4132-be1a-65e25531e73a/volumes" Dec 09 11:58:17 crc kubenswrapper[4745]: I1209 11:58:17.768232 4745 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podac238b52-4167-4847-b66f-6985b784268c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podac238b52-4167-4847-b66f-6985b784268c] : Timed out while waiting for systemd to remove kubepods-besteffort-podac238b52_4167_4847_b66f_6985b784268c.slice" Dec 09 11:58:17 crc kubenswrapper[4745]: E1209 11:58:17.768313 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podac238b52-4167-4847-b66f-6985b784268c] : unable to destroy cgroup paths for cgroup [kubepods besteffort podac238b52-4167-4847-b66f-6985b784268c] : Timed out while waiting for systemd to remove kubepods-besteffort-podac238b52_4167_4847_b66f_6985b784268c.slice" pod="openstack/openstackclient" podUID="ac238b52-4167-4847-b66f-6985b784268c" Dec 09 11:58:18 crc kubenswrapper[4745]: I1209 11:58:18.101604 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 11:58:21 crc kubenswrapper[4745]: E1209 11:58:21.886341 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:32 crc kubenswrapper[4745]: E1209 11:58:32.110394 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:32 crc kubenswrapper[4745]: I1209 11:58:32.555357 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:58:32 crc kubenswrapper[4745]: E1209 11:58:32.556250 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:58:42 crc kubenswrapper[4745]: E1209 11:58:42.312805 4745 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:44 crc kubenswrapper[4745]: I1209 11:58:44.554931 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:58:44 crc kubenswrapper[4745]: E1209 11:58:44.555905 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:58:52 crc kubenswrapper[4745]: E1209 11:58:52.498001 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7755562_a5b1_4b4d_833c_5a3179a2926c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda506f944_5b99_48af_a714_e24782ba1c06.slice/crio-75f05b692bee59a7c8306779305f38ca46ce05f0854a343e4dc5681433e2f1d8\": RecentStats: unable to find data in memory cache]" Dec 09 11:58:55 crc kubenswrapper[4745]: I1209 11:58:55.554662 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:58:55 crc kubenswrapper[4745]: E1209 11:58:55.555149 4745 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.699716 4745 scope.go:117] "RemoveContainer" containerID="809fbd96612b450289eed7e2404a3b4187004e68892406d748f16d6cbe48d42c" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.727026 4745 scope.go:117] "RemoveContainer" containerID="336ca4940dee1ff5df0505b50698efc59baa43ad9f49067d6cb89f54fb4ca9ca" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.756622 4745 scope.go:117] "RemoveContainer" containerID="0637d530cebe187aef48d0bff50d3bd185303e1bb8780e94e67215ee199f6ccf" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.797195 4745 scope.go:117] "RemoveContainer" containerID="4571aa0b6c65a337f5e02c0325f69e46bb7079de7dd610e205c196608f5ddffb" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.813202 4745 scope.go:117] "RemoveContainer" containerID="e0476a986f9634758ad0014851f819c295e0cb2787e1d0cd344530e1064f8968" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.846637 4745 scope.go:117] "RemoveContainer" containerID="5698e77aa1e57119afef2f0a2e0f754cc78f10fd79added5c52ec7c05881d368" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.872346 4745 scope.go:117] "RemoveContainer" containerID="a86fa369905db0b061fcf40232073cd840b343f60d8f376f143ad0c494a0513f" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.889252 4745 scope.go:117] "RemoveContainer" containerID="3bd3624dff82e436388a8bce15114930fc110be43138332c217c6c5dc788a655" Dec 09 11:59:02 crc kubenswrapper[4745]: I1209 11:59:02.910899 4745 scope.go:117] "RemoveContainer" containerID="0a4d00aad6ad2329c45c9e731e08b417c52114d065594d61af1d66b9e7473f8e" Dec 
09 11:59:10 crc kubenswrapper[4745]: I1209 11:59:10.555029 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:59:10 crc kubenswrapper[4745]: E1209 11:59:10.555733 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:59:21 crc kubenswrapper[4745]: I1209 11:59:21.555087 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:59:21 crc kubenswrapper[4745]: E1209 11:59:21.555831 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:59:33 crc kubenswrapper[4745]: I1209 11:59:33.558209 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:59:33 crc kubenswrapper[4745]: E1209 11:59:33.558916 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 11:59:47 crc kubenswrapper[4745]: I1209 11:59:47.555242 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 11:59:47 crc kubenswrapper[4745]: E1209 11:59:47.556647 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.160576 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh"] Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161673 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161702 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161724 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerName="nova-cell1-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161734 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerName="nova-cell1-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161747 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-central-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161757 
4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-central-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161765 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e6f79-b92b-47d3-97ae-d588fec5efcd" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161773 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e6f79-b92b-47d3-97ae-d588fec5efcd" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161787 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161795 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161822 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88802adc-d164-420b-98d2-a757b6627350" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161829 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="88802adc-d164-420b-98d2-a757b6627350" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161841 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161849 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161858 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d65ea6-5ea0-44fa-a4ab-82297d975a87" containerName="mariadb-account-delete" Dec 09 12:00:00 crc 
kubenswrapper[4745]: I1209 12:00:00.161886 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d65ea6-5ea0-44fa-a4ab-82297d975a87" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161895 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161903 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161917 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161925 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161940 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161947 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161959 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="proxy-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.161966 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="proxy-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161979 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-auditor" Dec 09 12:00:00 crc 
kubenswrapper[4745]: I1209 12:00:00.161986 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.161998 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d388089-75a9-4e64-8fcf-575fde454708" containerName="nova-cell0-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162005 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d388089-75a9-4e64-8fcf-575fde454708" containerName="nova-cell0-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162017 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a506f944-5b99-48af-a714-e24782ba1c06" containerName="kube-state-metrics" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162025 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a506f944-5b99-48af-a714-e24782ba1c06" containerName="kube-state-metrics" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162038 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-notification-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162046 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-notification-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162056 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162063 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162078 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162085 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162094 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162101 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-api" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162112 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162119 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162131 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-updater" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162139 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-updater" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162153 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="rabbitmq" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162160 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="rabbitmq" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162172 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" 
containerName="rabbitmq" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162179 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="rabbitmq" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162188 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b87df09-ea26-4c97-bcd6-4ee7c6250d00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162195 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b87df09-ea26-4c97-bcd6-4ee7c6250d00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162202 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="swift-recon-cron" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162209 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="swift-recon-cron" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162222 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162230 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-api" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162239 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162246 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162258 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" 
containerName="account-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162266 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-server" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162275 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server-init" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162285 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server-init" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162294 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="openstack-network-exporter" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162302 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="openstack-network-exporter" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162313 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="sg-core" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="sg-core" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162332 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162339 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162352 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162359 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162369 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162375 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162389 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a3f188-f451-4895-b500-52a9f7877d00" containerName="galera" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162396 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a3f188-f451-4895-b500-52a9f7877d00" containerName="galera" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162404 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162411 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162421 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="rsync" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162430 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="rsync" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162441 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0da87c5d-f709-4d9b-b182-421edbb61f00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162448 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da87c5d-f709-4d9b-b182-421edbb61f00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162457 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="probe" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162464 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="probe" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162477 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59337d-f366-446b-9752-eb371ee468e4" containerName="keystone-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162485 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59337d-f366-446b-9752-eb371ee468e4" containerName="keystone-api" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162495 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="cinder-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162502 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="cinder-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162531 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162539 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162548 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-reaper" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162555 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-reaper" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162563 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="setup-container" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162571 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="setup-container" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162579 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162586 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162594 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162601 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162614 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a3f188-f451-4895-b500-52a9f7877d00" containerName="mysql-bootstrap" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162620 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a3f188-f451-4895-b500-52a9f7877d00" containerName="mysql-bootstrap" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162629 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="84ab78e7-7419-4892-92c0-085db552be56" containerName="memcached" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162636 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ab78e7-7419-4892-92c0-085db552be56" containerName="memcached" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162646 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162654 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162667 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162674 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162683 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162690 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162702 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162709 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162722 4745 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-updater" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162728 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-updater" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162736 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-expirer" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162742 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-expirer" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162750 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="setup-container" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162757 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="setup-container" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162764 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162770 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162780 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162786 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162793 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162801 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-server" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162809 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162816 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162826 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162833 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-server" Dec 09 12:00:00 crc kubenswrapper[4745]: E1209 12:00:00.162842 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.162850 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163060 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b87df09-ea26-4c97-bcd6-4ee7c6250d00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163081 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da87c5d-f709-4d9b-b182-421edbb61f00" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163092 4745 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163103 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-metadata" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163110 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d65ea6-5ea0-44fa-a4ab-82297d975a87" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163126 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="openstack-network-exporter" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163135 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e6f79-b92b-47d3-97ae-d588fec5efcd" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163145 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163152 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="392b878a-37ba-4887-a699-672c8b92e947" containerName="nova-scheduler-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163161 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ab78e7-7419-4892-92c0-085db552be56" containerName="memcached" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163167 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovs-vswitchd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163176 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b860955-30eb-40e6-bd56-caf6098aed8a" containerName="rabbitmq" Dec 09 12:00:00 crc 
kubenswrapper[4745]: I1209 12:00:00.163189 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163201 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-expirer" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163214 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="proxy-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163228 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f786d16-8a6e-420b-b2b7-f785386e2191" containerName="nova-cell1-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163240 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163252 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163263 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163271 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-central-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163283 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a3f188-f451-4895-b500-52a9f7877d00" containerName="galera" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163295 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca899aa-fd10-43e0-ae58-a3a4ae2a4066" 
containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163303 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff59337d-f366-446b-9752-eb371ee468e4" containerName="keystone-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163312 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38b2a61-5161-4132-be1a-65e25531e73a" containerName="ovsdb-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163319 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c180eac0-e93f-4067-ba6d-32a023f424e6" containerName="nova-metadata-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163329 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="probe" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163338 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceba626e-26d1-495f-b88d-fed69e445ddb" containerName="rabbitmq" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163347 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="ceilometer-notification-agent" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163355 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f661d-4261-4e54-883d-cb0e7479a3d2" containerName="ovn-northd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163366 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f069021c-4758-4a29-98a5-2952a693cef9" containerName="neutron-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163375 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="rsync" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163386 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-auditor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163397 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163405 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4658eac8-46b0-448b-8bc7-7c783fcef1c6" containerName="barbican-keystone-listener-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163414 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-updater" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163426 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a506f944-5b99-48af-a714-e24782ba1c06" containerName="kube-state-metrics" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163437 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d93c20-efbc-41c3-bea1-7e7dad1ae70d" containerName="sg-core" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163446 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163453 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163464 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="swift-recon-cron" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163473 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d388089-75a9-4e64-8fcf-575fde454708" containerName="nova-cell0-conductor-conductor" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163482 4745 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e009b8-8eec-4028-ade9-84bc49d236c8" containerName="nova-api-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163490 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e822e9c0-d6fa-4880-a0e3-8dfb32405a6f" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163576 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163590 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="container-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163624 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163639 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ed8df4-e28f-4c76-bca1-a3d77ef789d4" containerName="glance-httpd" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163648 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a868ce-c1ab-457a-bd7f-224f8e982a13" containerName="cinder-api" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163661 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-server" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163670 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163683 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-updater" Dec 09 12:00:00 crc 
kubenswrapper[4745]: I1209 12:00:00.163693 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="account-reaper" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163706 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebc86b-4ef3-4d3f-911f-93036c9ce19b" containerName="object-replicator" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163713 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="600c553e-f8e5-4ec6-94e7-2981abc748cb" containerName="barbican-api-log" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163724 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="88802adc-d164-420b-98d2-a757b6627350" containerName="mariadb-account-delete" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163733 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaef3c48-5e7c-4ea3-a2d0-da44ea528455" containerName="cinder-scheduler" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.163743 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="da94483d-f361-42ef-95b4-d4b2c79b4d80" containerName="barbican-worker" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.164432 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.166953 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.167405 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.174785 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh"] Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.365183 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.365597 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvfb\" (UniqueName: \"kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.365826 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.466564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvfb\" (UniqueName: \"kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.466703 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.466803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.468737 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.473639 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.489729 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvfb\" (UniqueName: \"kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb\") pod \"collect-profiles-29421360-4t5hh\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.492619 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:00 crc kubenswrapper[4745]: I1209 12:00:00.942487 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh"] Dec 09 12:00:01 crc kubenswrapper[4745]: I1209 12:00:01.187111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" event={"ID":"727fb6c9-9336-4c44-8ad9-d44cd6d6da60","Type":"ContainerStarted","Data":"b527afeda634a1cf415e0da35f7a7204a71c2a4a0a2f1c3a308b81fcea7a0e35"} Dec 09 12:00:01 crc kubenswrapper[4745]: I1209 12:00:01.187573 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" event={"ID":"727fb6c9-9336-4c44-8ad9-d44cd6d6da60","Type":"ContainerStarted","Data":"7a1db114c0118b66cc60287dd65ddacb697911c4ccf72348119343a433013921"} Dec 09 12:00:01 crc kubenswrapper[4745]: I1209 12:00:01.208143 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" 
podStartSLOduration=1.2081151270000001 podStartE2EDuration="1.208115127s" podCreationTimestamp="2025-12-09 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 12:00:01.20156869 +0000 UTC m=+1688.026770224" watchObservedRunningTime="2025-12-09 12:00:01.208115127 +0000 UTC m=+1688.033316651" Dec 09 12:00:02 crc kubenswrapper[4745]: I1209 12:00:02.202398 4745 generic.go:334] "Generic (PLEG): container finished" podID="727fb6c9-9336-4c44-8ad9-d44cd6d6da60" containerID="b527afeda634a1cf415e0da35f7a7204a71c2a4a0a2f1c3a308b81fcea7a0e35" exitCode=0 Dec 09 12:00:02 crc kubenswrapper[4745]: I1209 12:00:02.202495 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" event={"ID":"727fb6c9-9336-4c44-8ad9-d44cd6d6da60","Type":"ContainerDied","Data":"b527afeda634a1cf415e0da35f7a7204a71c2a4a0a2f1c3a308b81fcea7a0e35"} Dec 09 12:00:02 crc kubenswrapper[4745]: I1209 12:00:02.555498 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:00:02 crc kubenswrapper[4745]: E1209 12:00:02.555902 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.094293 4745 scope.go:117] "RemoveContainer" containerID="005bbf69bfe8e201237c0b32380a847d1254e598d36321c440e5530aef3a86f2" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.115663 4745 scope.go:117] "RemoveContainer" 
containerID="67cf2989b9dc0e9ab08763199525684b72ba2bee0348375259236deaa1092c59" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.162590 4745 scope.go:117] "RemoveContainer" containerID="b9e81156e03227008305a4610edbd9b02d41542054e6710a627d02509756b16f" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.182321 4745 scope.go:117] "RemoveContainer" containerID="1f74f4a21b2673333af47e87beb31b19b062f5c33852b85486c3138b937f1825" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.229076 4745 scope.go:117] "RemoveContainer" containerID="0885f35be48b9edafa3024a22995fb186c052c7538483ed9d73e519b4bd4b0f3" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.260457 4745 scope.go:117] "RemoveContainer" containerID="416bc8400e98805bbceaa9c931f9fca8fd243adefdbab5e6704f409c75c19e64" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.286711 4745 scope.go:117] "RemoveContainer" containerID="e1b7bff6592b36237dc72ae840ee30fdfd4ffc88f26ccd26119c13ab09e03659" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.307835 4745 scope.go:117] "RemoveContainer" containerID="220c02beca474e41673d6888c903ca349368d7873a469bd6de0ce7dd666e8614" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.348240 4745 scope.go:117] "RemoveContainer" containerID="4aace1f8289737b036612bc10e88c7ddf6bde91fa3445accdbdcc74188e82c74" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.396169 4745 scope.go:117] "RemoveContainer" containerID="483597a5196d57f3f4f001d30d0b02bc3b98e998a978add42e621138a1dbee7e" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.432370 4745 scope.go:117] "RemoveContainer" containerID="b605e8f019c64255474366b2258f98198c1a71d83e50c329de38b59dbe8bf725" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.453857 4745 scope.go:117] "RemoveContainer" containerID="c1d791100af6e66bae867dd47c4c3aa0a99fe52177c4a48d7d7605c2a1a340ad" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.471615 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.552316 4745 scope.go:117] "RemoveContainer" containerID="e8813e68ec5c04a5eca84f32495694cece28ddc0c160e62ff127bb4da6d2300f" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.613294 4745 scope.go:117] "RemoveContainer" containerID="3e03148e4090f73a824b767293e49cf1ab1feefbb029d6cb1a8a2eacdd6d58e1" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.624378 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume\") pod \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.625220 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvfb\" (UniqueName: \"kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb\") pod \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.625279 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume\") pod \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\" (UID: \"727fb6c9-9336-4c44-8ad9-d44cd6d6da60\") " Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.630236 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume" (OuterVolumeSpecName: "config-volume") pod "727fb6c9-9336-4c44-8ad9-d44cd6d6da60" (UID: "727fb6c9-9336-4c44-8ad9-d44cd6d6da60"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.644115 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "727fb6c9-9336-4c44-8ad9-d44cd6d6da60" (UID: "727fb6c9-9336-4c44-8ad9-d44cd6d6da60"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.644865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb" (OuterVolumeSpecName: "kube-api-access-9vvfb") pod "727fb6c9-9336-4c44-8ad9-d44cd6d6da60" (UID: "727fb6c9-9336-4c44-8ad9-d44cd6d6da60"). InnerVolumeSpecName "kube-api-access-9vvfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.729912 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvfb\" (UniqueName: \"kubernetes.io/projected/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-kube-api-access-9vvfb\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.729944 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:03 crc kubenswrapper[4745]: I1209 12:00:03.729954 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/727fb6c9-9336-4c44-8ad9-d44cd6d6da60-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:00:04 crc kubenswrapper[4745]: I1209 12:00:04.291190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" 
event={"ID":"727fb6c9-9336-4c44-8ad9-d44cd6d6da60","Type":"ContainerDied","Data":"7a1db114c0118b66cc60287dd65ddacb697911c4ccf72348119343a433013921"} Dec 09 12:00:04 crc kubenswrapper[4745]: I1209 12:00:04.291253 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a1db114c0118b66cc60287dd65ddacb697911c4ccf72348119343a433013921" Dec 09 12:00:04 crc kubenswrapper[4745]: I1209 12:00:04.291372 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh" Dec 09 12:00:15 crc kubenswrapper[4745]: I1209 12:00:15.556809 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:00:15 crc kubenswrapper[4745]: E1209 12:00:15.559985 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:00:30 crc kubenswrapper[4745]: I1209 12:00:30.555337 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:00:30 crc kubenswrapper[4745]: E1209 12:00:30.556168 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:00:44 crc kubenswrapper[4745]: I1209 12:00:44.554622 4745 
scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:00:44 crc kubenswrapper[4745]: E1209 12:00:44.555718 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:00:57 crc kubenswrapper[4745]: I1209 12:00:57.555135 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:00:57 crc kubenswrapper[4745]: E1209 12:00:57.556378 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:01:03 crc kubenswrapper[4745]: I1209 12:01:03.938285 4745 scope.go:117] "RemoveContainer" containerID="47f1559d229361b96fe0ec92e8bff823067257e88ef088fa96fb0e9ea34b38bc" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.002607 4745 scope.go:117] "RemoveContainer" containerID="490ddfd6330b9b3002fed97b1c7f9089c9bcf9d6419e599757b54a76e8d3e14a" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.026245 4745 scope.go:117] "RemoveContainer" containerID="231ca82fbfa5fe8bcf14cb9fe6ec420a3371909539a49a3cc7e48ff962566999" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.061859 4745 scope.go:117] "RemoveContainer" containerID="0bbc9404782fc19ece5c2c4a2c4e395165e19b288eba5d996aad423d039b8dbe" Dec 09 
12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.107909 4745 scope.go:117] "RemoveContainer" containerID="697de4a9fded904a811d04612b2ec365c4c14861fd0b1d3d145deddded6bd815" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.134470 4745 scope.go:117] "RemoveContainer" containerID="f59d16e21842e09e539e7cb40ecdfd9188979d63816912780cb72eb300e3fd22" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.158772 4745 scope.go:117] "RemoveContainer" containerID="14f30d7116a31c94310abb53f8936d59624cb617dd766351680c7a5bb720b515" Dec 09 12:01:04 crc kubenswrapper[4745]: I1209 12:01:04.189471 4745 scope.go:117] "RemoveContainer" containerID="09ad1c05b19779977f458ccdf6da3b4f7981fb3e1c55e62d2c5d17dbd9fbfe34" Dec 09 12:01:09 crc kubenswrapper[4745]: I1209 12:01:09.555736 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:01:09 crc kubenswrapper[4745]: E1209 12:01:09.564110 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:01:20 crc kubenswrapper[4745]: I1209 12:01:20.555909 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:01:20 crc kubenswrapper[4745]: E1209 12:01:20.556737 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:01:31 crc kubenswrapper[4745]: I1209 12:01:31.555753 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:01:31 crc kubenswrapper[4745]: E1209 12:01:31.556914 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:01:46 crc kubenswrapper[4745]: I1209 12:01:46.555550 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:01:46 crc kubenswrapper[4745]: E1209 12:01:46.556715 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:01:57 crc kubenswrapper[4745]: I1209 12:01:57.555656 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:01:57 crc kubenswrapper[4745]: E1209 12:01:57.557137 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:02:04 crc kubenswrapper[4745]: I1209 12:02:04.343657 4745 scope.go:117] "RemoveContainer" containerID="84d0dbf76c2ff2ed4c9dc4cbbf8300c430928bdea45eabd90653242d68969b38" Dec 09 12:02:04 crc kubenswrapper[4745]: I1209 12:02:04.373887 4745 scope.go:117] "RemoveContainer" containerID="91e93b32cad320971d9e0a25ede2bff04e8efbc6587e0a74698dda3692d9f90a" Dec 09 12:02:04 crc kubenswrapper[4745]: I1209 12:02:04.430234 4745 scope.go:117] "RemoveContainer" containerID="71590042ec928a9bfe46dae653af9e69e9269e11cd2b38ede0e4c60102844e9d" Dec 09 12:02:04 crc kubenswrapper[4745]: I1209 12:02:04.471623 4745 scope.go:117] "RemoveContainer" containerID="3cb8e987a7a6b5a17dc3a1ce4421d7a97a43c5bcf0226be502cadc9674ba1fdf" Dec 09 12:02:04 crc kubenswrapper[4745]: I1209 12:02:04.493711 4745 scope.go:117] "RemoveContainer" containerID="38870d77b6d142f1c1cade48e3d884f4d22c05a1003d9c79fc17066cda6fa023" Dec 09 12:02:08 crc kubenswrapper[4745]: I1209 12:02:08.554795 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:02:08 crc kubenswrapper[4745]: E1209 12:02:08.555509 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:02:23 crc kubenswrapper[4745]: I1209 12:02:23.564259 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:02:23 crc 
kubenswrapper[4745]: E1209 12:02:23.565224 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.404791 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:27 crc kubenswrapper[4745]: E1209 12:02:27.405978 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727fb6c9-9336-4c44-8ad9-d44cd6d6da60" containerName="collect-profiles" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.406000 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="727fb6c9-9336-4c44-8ad9-d44cd6d6da60" containerName="collect-profiles" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.406180 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="727fb6c9-9336-4c44-8ad9-d44cd6d6da60" containerName="collect-profiles" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.407360 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.426918 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.579793 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4qk\" (UniqueName: \"kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.580705 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.580873 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.683572 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4qk\" (UniqueName: \"kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.683687 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.683715 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.684371 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.684594 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.714824 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4qk\" (UniqueName: \"kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk\") pod \"redhat-marketplace-wwmrj\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:27 crc kubenswrapper[4745]: I1209 12:02:27.805636 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:28 crc kubenswrapper[4745]: I1209 12:02:28.279326 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:28 crc kubenswrapper[4745]: I1209 12:02:28.709999 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerStarted","Data":"50e65cc5cd210fdbc18a3bfb49537382ed9860eb111a5669025de4c917391c53"} Dec 09 12:02:29 crc kubenswrapper[4745]: I1209 12:02:29.725420 4745 generic.go:334] "Generic (PLEG): container finished" podID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerID="753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36" exitCode=0 Dec 09 12:02:29 crc kubenswrapper[4745]: I1209 12:02:29.725487 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerDied","Data":"753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36"} Dec 09 12:02:29 crc kubenswrapper[4745]: I1209 12:02:29.728924 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:02:30 crc kubenswrapper[4745]: I1209 12:02:30.763548 4745 generic.go:334] "Generic (PLEG): container finished" podID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerID="2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37" exitCode=0 Dec 09 12:02:30 crc kubenswrapper[4745]: I1209 12:02:30.763650 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerDied","Data":"2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37"} Dec 09 12:02:31 crc kubenswrapper[4745]: I1209 12:02:31.780796 4745 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerStarted","Data":"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d"} Dec 09 12:02:31 crc kubenswrapper[4745]: I1209 12:02:31.803872 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wwmrj" podStartSLOduration=3.023898826 podStartE2EDuration="4.803845758s" podCreationTimestamp="2025-12-09 12:02:27 +0000 UTC" firstStartedPulling="2025-12-09 12:02:29.728317818 +0000 UTC m=+1836.553519382" lastFinishedPulling="2025-12-09 12:02:31.50826475 +0000 UTC m=+1838.333466314" observedRunningTime="2025-12-09 12:02:31.801045162 +0000 UTC m=+1838.626246686" watchObservedRunningTime="2025-12-09 12:02:31.803845758 +0000 UTC m=+1838.629047282" Dec 09 12:02:36 crc kubenswrapper[4745]: I1209 12:02:36.555122 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:02:37 crc kubenswrapper[4745]: I1209 12:02:37.806883 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:37 crc kubenswrapper[4745]: I1209 12:02:37.807551 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:37 crc kubenswrapper[4745]: I1209 12:02:37.842978 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37"} Dec 09 12:02:37 crc kubenswrapper[4745]: I1209 12:02:37.879679 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:37 crc kubenswrapper[4745]: I1209 12:02:37.946118 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:38 crc kubenswrapper[4745]: I1209 12:02:38.126042 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:39 crc kubenswrapper[4745]: I1209 12:02:39.941246 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wwmrj" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="registry-server" containerID="cri-o://7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d" gracePeriod=2 Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.454367 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.596979 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt4qk\" (UniqueName: \"kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk\") pod \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.597505 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities\") pod \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.597890 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content\") pod \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\" (UID: \"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46\") " Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.598640 4745 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities" (OuterVolumeSpecName: "utilities") pod "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" (UID: "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.598865 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.612024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk" (OuterVolumeSpecName: "kube-api-access-wt4qk") pod "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" (UID: "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46"). InnerVolumeSpecName "kube-api-access-wt4qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.633323 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" (UID: "4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.700207 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt4qk\" (UniqueName: \"kubernetes.io/projected/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-kube-api-access-wt4qk\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.700254 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.959225 4745 generic.go:334] "Generic (PLEG): container finished" podID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerID="7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d" exitCode=0 Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.959308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerDied","Data":"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d"} Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.959369 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwmrj" event={"ID":"4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46","Type":"ContainerDied","Data":"50e65cc5cd210fdbc18a3bfb49537382ed9860eb111a5669025de4c917391c53"} Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.959372 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwmrj" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.959420 4745 scope.go:117] "RemoveContainer" containerID="7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d" Dec 09 12:02:40 crc kubenswrapper[4745]: I1209 12:02:40.992226 4745 scope.go:117] "RemoveContainer" containerID="2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.016761 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.022839 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwmrj"] Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.026918 4745 scope.go:117] "RemoveContainer" containerID="753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.074173 4745 scope.go:117] "RemoveContainer" containerID="7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d" Dec 09 12:02:41 crc kubenswrapper[4745]: E1209 12:02:41.075021 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d\": container with ID starting with 7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d not found: ID does not exist" containerID="7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.075086 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d"} err="failed to get container status \"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d\": rpc error: code = NotFound desc = could not find container 
\"7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d\": container with ID starting with 7d316fef415100762a028939604e3852ddf55e5f775220b94f9740babb58795d not found: ID does not exist" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.075127 4745 scope.go:117] "RemoveContainer" containerID="2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37" Dec 09 12:02:41 crc kubenswrapper[4745]: E1209 12:02:41.075788 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37\": container with ID starting with 2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37 not found: ID does not exist" containerID="2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.075834 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37"} err="failed to get container status \"2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37\": rpc error: code = NotFound desc = could not find container \"2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37\": container with ID starting with 2c7148531255597495b8e2a7135f2138a551cd2ea0e08ce982ca901e41052e37 not found: ID does not exist" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.075901 4745 scope.go:117] "RemoveContainer" containerID="753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36" Dec 09 12:02:41 crc kubenswrapper[4745]: E1209 12:02:41.076413 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36\": container with ID starting with 753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36 not found: ID does not exist" 
containerID="753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.076542 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36"} err="failed to get container status \"753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36\": rpc error: code = NotFound desc = could not find container \"753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36\": container with ID starting with 753986fc48379f31727333e3b43fe5c9acc9674eec8e5dd1ec53ecb17f755e36 not found: ID does not exist" Dec 09 12:02:41 crc kubenswrapper[4745]: I1209 12:02:41.568919 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" path="/var/lib/kubelet/pods/4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46/volumes" Dec 09 12:03:04 crc kubenswrapper[4745]: I1209 12:03:04.606139 4745 scope.go:117] "RemoveContainer" containerID="d2d724612e33be6891fca27b52fe886988ad84a0f8f9f933dfaa7cdd089d1f93" Dec 09 12:03:04 crc kubenswrapper[4745]: I1209 12:03:04.667825 4745 scope.go:117] "RemoveContainer" containerID="4d0d0764d9f0fe5167222401d27b58294f16f2e351cb70a946140ba51e127793" Dec 09 12:03:04 crc kubenswrapper[4745]: I1209 12:03:04.703641 4745 scope.go:117] "RemoveContainer" containerID="0959a307a96a3c8b986b8df7cc54318c82c7451c29bafc311bb4eafa178c8154" Dec 09 12:03:04 crc kubenswrapper[4745]: I1209 12:03:04.729011 4745 scope.go:117] "RemoveContainer" containerID="2b371925777f6434276ab7a74c14a8e465cf580329fd7735e425f6016bedfb3b" Dec 09 12:03:04 crc kubenswrapper[4745]: I1209 12:03:04.753468 4745 scope.go:117] "RemoveContainer" containerID="54a2edbbc1397cc5ce4a40d2a57015e139f31d6f9f4b30325e3a940d377baabd" Dec 09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.844815 4745 scope.go:117] "RemoveContainer" containerID="e0758a906f3658a8bea2c9761871942f25867001c078826c156c644ccceb845e" Dec 
09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.870152 4745 scope.go:117] "RemoveContainer" containerID="d86765cb79f89a9cbc7de973c711eb1d6f3dd9f41b2b18a025ac97233e7bc3a7" Dec 09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.904827 4745 scope.go:117] "RemoveContainer" containerID="292c118e9ea99e79f59b1386147eb09c4d2872305f317b469d80803ac95abb15" Dec 09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.938068 4745 scope.go:117] "RemoveContainer" containerID="c4860343d4ee81a640779864c9d8d1bbc93d0c65395dd5b02fec146d0aee40be" Dec 09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.962075 4745 scope.go:117] "RemoveContainer" containerID="feb47bcafcf25de8587ef4bb55ae093b5fe21d5f5e5449021a1e056cb1aec196" Dec 09 12:04:04 crc kubenswrapper[4745]: I1209 12:04:04.992454 4745 scope.go:117] "RemoveContainer" containerID="decfdf9a27f78a38287e325560f9246d8ee33368520c37c1e0ab35b37e5ff1ae" Dec 09 12:04:05 crc kubenswrapper[4745]: I1209 12:04:05.012834 4745 scope.go:117] "RemoveContainer" containerID="29b4a184917de6914515f00cc21f359e17c2106e2966f12c01a46b5cfb3327b7" Dec 09 12:04:55 crc kubenswrapper[4745]: I1209 12:04:55.476133 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:04:55 crc kubenswrapper[4745]: I1209 12:04:55.477144 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:05:25 crc kubenswrapper[4745]: I1209 12:05:25.475546 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:05:25 crc kubenswrapper[4745]: I1209 12:05:25.476461 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.665968 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:38 crc kubenswrapper[4745]: E1209 12:05:38.673302 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="registry-server" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.673422 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="registry-server" Dec 09 12:05:38 crc kubenswrapper[4745]: E1209 12:05:38.673754 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="extract-content" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.673870 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="extract-content" Dec 09 12:05:38 crc kubenswrapper[4745]: E1209 12:05:38.673978 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="extract-utilities" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.674059 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="extract-utilities" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 
12:05:38.674318 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5dbc88-3eed-4847-83c2-bbf6cd5f9f46" containerName="registry-server" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.676485 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.678692 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.779472 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.779636 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xk6b\" (UniqueName: \"kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.779674 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.881404 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk6b\" (UniqueName: 
\"kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.881475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.881548 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.882053 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.882429 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.905427 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk6b\" (UniqueName: 
\"kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b\") pod \"redhat-operators-n897c\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:38 crc kubenswrapper[4745]: I1209 12:05:38.999599 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:39 crc kubenswrapper[4745]: I1209 12:05:39.243641 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:39 crc kubenswrapper[4745]: I1209 12:05:39.657312 4745 generic.go:334] "Generic (PLEG): container finished" podID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerID="7e32ef35d9234561b663a9c8dbded5e2a41ca499dc7c0f64a6d8035db314b0fc" exitCode=0 Dec 09 12:05:39 crc kubenswrapper[4745]: I1209 12:05:39.657526 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerDied","Data":"7e32ef35d9234561b663a9c8dbded5e2a41ca499dc7c0f64a6d8035db314b0fc"} Dec 09 12:05:39 crc kubenswrapper[4745]: I1209 12:05:39.657630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerStarted","Data":"5570c2bda59b32de396118a42d858bdb1ae3c78de0444891f5626d719ad5a6d9"} Dec 09 12:05:41 crc kubenswrapper[4745]: I1209 12:05:41.679674 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerStarted","Data":"d8b2239938d3d3131f548d41447f5bbb58cbacd40f835f8d8488ea66803b5021"} Dec 09 12:05:42 crc kubenswrapper[4745]: I1209 12:05:42.689270 4745 generic.go:334] "Generic (PLEG): container finished" podID="08220a67-4567-40cf-85b2-4dd0aea2e870" 
containerID="d8b2239938d3d3131f548d41447f5bbb58cbacd40f835f8d8488ea66803b5021" exitCode=0 Dec 09 12:05:42 crc kubenswrapper[4745]: I1209 12:05:42.689298 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerDied","Data":"d8b2239938d3d3131f548d41447f5bbb58cbacd40f835f8d8488ea66803b5021"} Dec 09 12:05:44 crc kubenswrapper[4745]: I1209 12:05:44.708086 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerStarted","Data":"636f36c6405c7d6cd3d3822373fa2019951c55ad8bfe3e0ab298b1f025f8de10"} Dec 09 12:05:44 crc kubenswrapper[4745]: I1209 12:05:44.737748 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n897c" podStartSLOduration=2.558545615 podStartE2EDuration="6.737722882s" podCreationTimestamp="2025-12-09 12:05:38 +0000 UTC" firstStartedPulling="2025-12-09 12:05:39.65919814 +0000 UTC m=+2026.484399674" lastFinishedPulling="2025-12-09 12:05:43.838375407 +0000 UTC m=+2030.663576941" observedRunningTime="2025-12-09 12:05:44.735991625 +0000 UTC m=+2031.561193149" watchObservedRunningTime="2025-12-09 12:05:44.737722882 +0000 UTC m=+2031.562924406" Dec 09 12:05:49 crc kubenswrapper[4745]: I1209 12:05:49.000803 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:49 crc kubenswrapper[4745]: I1209 12:05:49.001131 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:49 crc kubenswrapper[4745]: I1209 12:05:49.046878 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:49 crc kubenswrapper[4745]: I1209 12:05:49.798598 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:49 crc kubenswrapper[4745]: I1209 12:05:49.856212 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:51 crc kubenswrapper[4745]: I1209 12:05:51.774851 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n897c" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="registry-server" containerID="cri-o://636f36c6405c7d6cd3d3822373fa2019951c55ad8bfe3e0ab298b1f025f8de10" gracePeriod=2 Dec 09 12:05:54 crc kubenswrapper[4745]: I1209 12:05:54.900696 4745 generic.go:334] "Generic (PLEG): container finished" podID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerID="636f36c6405c7d6cd3d3822373fa2019951c55ad8bfe3e0ab298b1f025f8de10" exitCode=0 Dec 09 12:05:54 crc kubenswrapper[4745]: I1209 12:05:54.901334 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerDied","Data":"636f36c6405c7d6cd3d3822373fa2019951c55ad8bfe3e0ab298b1f025f8de10"} Dec 09 12:05:54 crc kubenswrapper[4745]: I1209 12:05:54.996836 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.187628 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities\") pod \"08220a67-4567-40cf-85b2-4dd0aea2e870\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.187772 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content\") pod \"08220a67-4567-40cf-85b2-4dd0aea2e870\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.187906 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xk6b\" (UniqueName: \"kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b\") pod \"08220a67-4567-40cf-85b2-4dd0aea2e870\" (UID: \"08220a67-4567-40cf-85b2-4dd0aea2e870\") " Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.188725 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities" (OuterVolumeSpecName: "utilities") pod "08220a67-4567-40cf-85b2-4dd0aea2e870" (UID: "08220a67-4567-40cf-85b2-4dd0aea2e870"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.200693 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b" (OuterVolumeSpecName: "kube-api-access-5xk6b") pod "08220a67-4567-40cf-85b2-4dd0aea2e870" (UID: "08220a67-4567-40cf-85b2-4dd0aea2e870"). InnerVolumeSpecName "kube-api-access-5xk6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.289993 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.290035 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xk6b\" (UniqueName: \"kubernetes.io/projected/08220a67-4567-40cf-85b2-4dd0aea2e870-kube-api-access-5xk6b\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.308891 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08220a67-4567-40cf-85b2-4dd0aea2e870" (UID: "08220a67-4567-40cf-85b2-4dd0aea2e870"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.391460 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08220a67-4567-40cf-85b2-4dd0aea2e870-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.475928 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.475999 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.476058 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.476946 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.477010 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37" gracePeriod=600 Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.917167 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37" exitCode=0 Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.917248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37"} Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.917565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc"} Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.917609 4745 scope.go:117] "RemoveContainer" containerID="0639b997cea4ef3235fd3e869d3efbbe11dedf411af4766ca166b24575fb3be3" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.920932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n897c" event={"ID":"08220a67-4567-40cf-85b2-4dd0aea2e870","Type":"ContainerDied","Data":"5570c2bda59b32de396118a42d858bdb1ae3c78de0444891f5626d719ad5a6d9"} Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.921038 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n897c" Dec 09 12:05:55 crc kubenswrapper[4745]: I1209 12:05:55.976208 4745 scope.go:117] "RemoveContainer" containerID="636f36c6405c7d6cd3d3822373fa2019951c55ad8bfe3e0ab298b1f025f8de10" Dec 09 12:05:56 crc kubenswrapper[4745]: I1209 12:05:56.001784 4745 scope.go:117] "RemoveContainer" containerID="d8b2239938d3d3131f548d41447f5bbb58cbacd40f835f8d8488ea66803b5021" Dec 09 12:05:56 crc kubenswrapper[4745]: I1209 12:05:56.024743 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:56 crc kubenswrapper[4745]: I1209 12:05:56.030062 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n897c"] Dec 09 12:05:56 crc kubenswrapper[4745]: I1209 12:05:56.030868 4745 scope.go:117] "RemoveContainer" containerID="7e32ef35d9234561b663a9c8dbded5e2a41ca499dc7c0f64a6d8035db314b0fc" Dec 09 12:05:57 crc kubenswrapper[4745]: I1209 12:05:57.570429 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" path="/var/lib/kubelet/pods/08220a67-4567-40cf-85b2-4dd0aea2e870/volumes" Dec 09 12:07:16 crc 
kubenswrapper[4745]: I1209 12:07:16.026423 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:16 crc kubenswrapper[4745]: E1209 12:07:16.027364 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="extract-content" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.027395 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="extract-content" Dec 09 12:07:16 crc kubenswrapper[4745]: E1209 12:07:16.027415 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="registry-server" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.027422 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="registry-server" Dec 09 12:07:16 crc kubenswrapper[4745]: E1209 12:07:16.027435 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="extract-utilities" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.027443 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="extract-utilities" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.027716 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="08220a67-4567-40cf-85b2-4dd0aea2e870" containerName="registry-server" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.029132 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.040183 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.080184 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.080284 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.080331 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.182314 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.182415 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.182469 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.183024 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.183298 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.204306 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4\") pod \"certified-operators-hpvcx\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.356011 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:16 crc kubenswrapper[4745]: I1209 12:07:16.875116 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:17 crc kubenswrapper[4745]: I1209 12:07:17.633016 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerID="ad8be0d4f38343f0c806da660b852b2fda8d42d8492e2c768ef96d694af0f878" exitCode=0 Dec 09 12:07:17 crc kubenswrapper[4745]: I1209 12:07:17.633216 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerDied","Data":"ad8be0d4f38343f0c806da660b852b2fda8d42d8492e2c768ef96d694af0f878"} Dec 09 12:07:17 crc kubenswrapper[4745]: I1209 12:07:17.633634 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerStarted","Data":"2ce73ceedefe2ec2b6cd323b15f4f01ad370c56479f278a61e624ef53a9c9d29"} Dec 09 12:07:18 crc kubenswrapper[4745]: I1209 12:07:18.643029 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerStarted","Data":"f2db5ac708fd49943b0a9e2a37f595625640832fde56fa32d86ee5bd46617699"} Dec 09 12:07:19 crc kubenswrapper[4745]: I1209 12:07:19.652094 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerID="f2db5ac708fd49943b0a9e2a37f595625640832fde56fa32d86ee5bd46617699" exitCode=0 Dec 09 12:07:19 crc kubenswrapper[4745]: I1209 12:07:19.652154 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" 
event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerDied","Data":"f2db5ac708fd49943b0a9e2a37f595625640832fde56fa32d86ee5bd46617699"} Dec 09 12:07:20 crc kubenswrapper[4745]: I1209 12:07:20.662019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerStarted","Data":"137e0b4533f19e43671c6d2086e4e3d77d7e6df9ee8dff175bdf51b53c6a16f5"} Dec 09 12:07:20 crc kubenswrapper[4745]: I1209 12:07:20.684333 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpvcx" podStartSLOduration=2.292554247 podStartE2EDuration="4.684277092s" podCreationTimestamp="2025-12-09 12:07:16 +0000 UTC" firstStartedPulling="2025-12-09 12:07:17.636909093 +0000 UTC m=+2124.462110627" lastFinishedPulling="2025-12-09 12:07:20.028631948 +0000 UTC m=+2126.853833472" observedRunningTime="2025-12-09 12:07:20.679218386 +0000 UTC m=+2127.504419920" watchObservedRunningTime="2025-12-09 12:07:20.684277092 +0000 UTC m=+2127.509478616" Dec 09 12:07:26 crc kubenswrapper[4745]: I1209 12:07:26.356704 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:26 crc kubenswrapper[4745]: I1209 12:07:26.357224 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:26 crc kubenswrapper[4745]: I1209 12:07:26.397353 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:26 crc kubenswrapper[4745]: I1209 12:07:26.745020 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:26 crc kubenswrapper[4745]: I1209 12:07:26.792287 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:28 crc kubenswrapper[4745]: I1209 12:07:28.722096 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpvcx" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="registry-server" containerID="cri-o://137e0b4533f19e43671c6d2086e4e3d77d7e6df9ee8dff175bdf51b53c6a16f5" gracePeriod=2 Dec 09 12:07:29 crc kubenswrapper[4745]: I1209 12:07:29.731472 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerID="137e0b4533f19e43671c6d2086e4e3d77d7e6df9ee8dff175bdf51b53c6a16f5" exitCode=0 Dec 09 12:07:29 crc kubenswrapper[4745]: I1209 12:07:29.731578 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerDied","Data":"137e0b4533f19e43671c6d2086e4e3d77d7e6df9ee8dff175bdf51b53c6a16f5"} Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.371734 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.406116 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities\") pod \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.406184 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content\") pod \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.406235 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4\") pod \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\" (UID: \"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac\") " Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.407110 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities" (OuterVolumeSpecName: "utilities") pod "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" (UID: "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.411828 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4" (OuterVolumeSpecName: "kube-api-access-55pm4") pod "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" (UID: "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac"). InnerVolumeSpecName "kube-api-access-55pm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.461127 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" (UID: "7c27c91b-fdb0-4c80-8541-cadbeb30a3ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.508156 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.508195 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-kube-api-access-55pm4\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.508210 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.743893 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpvcx" event={"ID":"7c27c91b-fdb0-4c80-8541-cadbeb30a3ac","Type":"ContainerDied","Data":"2ce73ceedefe2ec2b6cd323b15f4f01ad370c56479f278a61e624ef53a9c9d29"} Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.743983 4745 scope.go:117] "RemoveContainer" containerID="137e0b4533f19e43671c6d2086e4e3d77d7e6df9ee8dff175bdf51b53c6a16f5" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.743993 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpvcx" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.775296 4745 scope.go:117] "RemoveContainer" containerID="f2db5ac708fd49943b0a9e2a37f595625640832fde56fa32d86ee5bd46617699" Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.794938 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.803191 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpvcx"] Dec 09 12:07:30 crc kubenswrapper[4745]: I1209 12:07:30.805768 4745 scope.go:117] "RemoveContainer" containerID="ad8be0d4f38343f0c806da660b852b2fda8d42d8492e2c768ef96d694af0f878" Dec 09 12:07:31 crc kubenswrapper[4745]: I1209 12:07:31.566489 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" path="/var/lib/kubelet/pods/7c27c91b-fdb0-4c80-8541-cadbeb30a3ac/volumes" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.019703 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:07:54 crc kubenswrapper[4745]: E1209 12:07:54.020508 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="extract-utilities" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.020536 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="extract-utilities" Dec 09 12:07:54 crc kubenswrapper[4745]: E1209 12:07:54.020546 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="registry-server" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.020555 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" 
containerName="registry-server" Dec 09 12:07:54 crc kubenswrapper[4745]: E1209 12:07:54.020581 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="extract-content" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.020587 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="extract-content" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.020734 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c27c91b-fdb0-4c80-8541-cadbeb30a3ac" containerName="registry-server" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.021891 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.032364 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.093193 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj2jz\" (UniqueName: \"kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.093244 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.093310 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.194417 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.194577 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj2jz\" (UniqueName: \"kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.194603 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.194949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.195087 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.221119 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj2jz\" (UniqueName: \"kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz\") pod \"community-operators-c797c\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.339370 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:07:54 crc kubenswrapper[4745]: I1209 12:07:54.942236 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.475346 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.475700 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.955644 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerID="4649cd34344358b926058c94fab755202c40c77269ae3dc36fc4d0d66abcfc9f" exitCode=0 Dec 09 
12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.955953 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerDied","Data":"4649cd34344358b926058c94fab755202c40c77269ae3dc36fc4d0d66abcfc9f"} Dec 09 12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.955992 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerStarted","Data":"9890198bb6295466a75f3d7db4de8f8732d17a03c8da5afee18c9f641a7daf6e"} Dec 09 12:07:55 crc kubenswrapper[4745]: I1209 12:07:55.958666 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:07:56 crc kubenswrapper[4745]: I1209 12:07:56.965212 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerID="7dc5d220e87956a4f789611fc0d01fd8663c7d90b5b389f36b28be4ef1e72d5e" exitCode=0 Dec 09 12:07:56 crc kubenswrapper[4745]: I1209 12:07:56.965267 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerDied","Data":"7dc5d220e87956a4f789611fc0d01fd8663c7d90b5b389f36b28be4ef1e72d5e"} Dec 09 12:07:57 crc kubenswrapper[4745]: I1209 12:07:57.976002 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerStarted","Data":"46e2a7d7fd69eeb31c21b280dc711c50a7cfd8b22a638b9b13d56832ce3dd97e"} Dec 09 12:07:58 crc kubenswrapper[4745]: I1209 12:07:58.007406 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c797c" podStartSLOduration=3.579584517 podStartE2EDuration="5.007375168s" podCreationTimestamp="2025-12-09 
12:07:53 +0000 UTC" firstStartedPulling="2025-12-09 12:07:55.958294949 +0000 UTC m=+2162.783496483" lastFinishedPulling="2025-12-09 12:07:57.38608559 +0000 UTC m=+2164.211287134" observedRunningTime="2025-12-09 12:07:58.001407367 +0000 UTC m=+2164.826608911" watchObservedRunningTime="2025-12-09 12:07:58.007375168 +0000 UTC m=+2164.832576702" Dec 09 12:08:04 crc kubenswrapper[4745]: I1209 12:08:04.340580 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:04 crc kubenswrapper[4745]: I1209 12:08:04.341205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:04 crc kubenswrapper[4745]: I1209 12:08:04.393476 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:05 crc kubenswrapper[4745]: I1209 12:08:05.079041 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:05 crc kubenswrapper[4745]: I1209 12:08:05.141229 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:08:07 crc kubenswrapper[4745]: I1209 12:08:07.050593 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c797c" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="registry-server" containerID="cri-o://46e2a7d7fd69eeb31c21b280dc711c50a7cfd8b22a638b9b13d56832ce3dd97e" gracePeriod=2 Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.061563 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerID="46e2a7d7fd69eeb31c21b280dc711c50a7cfd8b22a638b9b13d56832ce3dd97e" exitCode=0 Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.061653 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerDied","Data":"46e2a7d7fd69eeb31c21b280dc711c50a7cfd8b22a638b9b13d56832ce3dd97e"} Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.534857 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.726403 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities\") pod \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.726490 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content\") pod \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.726637 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj2jz\" (UniqueName: \"kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz\") pod \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\" (UID: \"fa8f7849-ff64-4fc6-9df8-8d13dc911b54\") " Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.727366 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities" (OuterVolumeSpecName: "utilities") pod "fa8f7849-ff64-4fc6-9df8-8d13dc911b54" (UID: "fa8f7849-ff64-4fc6-9df8-8d13dc911b54"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.734022 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz" (OuterVolumeSpecName: "kube-api-access-dj2jz") pod "fa8f7849-ff64-4fc6-9df8-8d13dc911b54" (UID: "fa8f7849-ff64-4fc6-9df8-8d13dc911b54"). InnerVolumeSpecName "kube-api-access-dj2jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.790966 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa8f7849-ff64-4fc6-9df8-8d13dc911b54" (UID: "fa8f7849-ff64-4fc6-9df8-8d13dc911b54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.829442 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.829480 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj2jz\" (UniqueName: \"kubernetes.io/projected/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-kube-api-access-dj2jz\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:08 crc kubenswrapper[4745]: I1209 12:08:08.829492 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa8f7849-ff64-4fc6-9df8-8d13dc911b54-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.073672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c797c" 
event={"ID":"fa8f7849-ff64-4fc6-9df8-8d13dc911b54","Type":"ContainerDied","Data":"9890198bb6295466a75f3d7db4de8f8732d17a03c8da5afee18c9f641a7daf6e"} Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.073752 4745 scope.go:117] "RemoveContainer" containerID="46e2a7d7fd69eeb31c21b280dc711c50a7cfd8b22a638b9b13d56832ce3dd97e" Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.073756 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c797c" Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.109458 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.113201 4745 scope.go:117] "RemoveContainer" containerID="7dc5d220e87956a4f789611fc0d01fd8663c7d90b5b389f36b28be4ef1e72d5e" Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.116311 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c797c"] Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.133314 4745 scope.go:117] "RemoveContainer" containerID="4649cd34344358b926058c94fab755202c40c77269ae3dc36fc4d0d66abcfc9f" Dec 09 12:08:09 crc kubenswrapper[4745]: I1209 12:08:09.568613 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" path="/var/lib/kubelet/pods/fa8f7849-ff64-4fc6-9df8-8d13dc911b54/volumes" Dec 09 12:08:25 crc kubenswrapper[4745]: I1209 12:08:25.475935 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:08:25 crc kubenswrapper[4745]: I1209 12:08:25.476430 4745 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:08:55 crc kubenswrapper[4745]: I1209 12:08:55.475057 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:08:55 crc kubenswrapper[4745]: I1209 12:08:55.475697 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:08:55 crc kubenswrapper[4745]: I1209 12:08:55.475759 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:08:55 crc kubenswrapper[4745]: I1209 12:08:55.476556 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:08:55 crc kubenswrapper[4745]: I1209 12:08:55.476617 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" 
containerID="cri-o://a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" gracePeriod=600 Dec 09 12:08:55 crc kubenswrapper[4745]: E1209 12:08:55.662722 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:08:56 crc kubenswrapper[4745]: I1209 12:08:56.541259 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" exitCode=0 Dec 09 12:08:56 crc kubenswrapper[4745]: I1209 12:08:56.541322 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc"} Dec 09 12:08:56 crc kubenswrapper[4745]: I1209 12:08:56.541394 4745 scope.go:117] "RemoveContainer" containerID="80ec805e25be5f9e7a50691b3bc366c0c67b3268413cc1b2a87045ce55eecc37" Dec 09 12:08:56 crc kubenswrapper[4745]: I1209 12:08:56.542091 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:08:56 crc kubenswrapper[4745]: E1209 12:08:56.542525 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:09:11 crc kubenswrapper[4745]: I1209 12:09:11.555109 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:09:11 crc kubenswrapper[4745]: E1209 12:09:11.555998 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:09:23 crc kubenswrapper[4745]: I1209 12:09:23.559355 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:09:23 crc kubenswrapper[4745]: E1209 12:09:23.560146 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:09:36 crc kubenswrapper[4745]: I1209 12:09:36.555302 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:09:36 crc kubenswrapper[4745]: E1209 12:09:36.556057 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:09:49 crc kubenswrapper[4745]: I1209 12:09:49.555547 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:09:49 crc kubenswrapper[4745]: E1209 12:09:49.556397 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:10:00 crc kubenswrapper[4745]: I1209 12:10:00.555678 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:10:00 crc kubenswrapper[4745]: E1209 12:10:00.558305 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:10:12 crc kubenswrapper[4745]: I1209 12:10:12.554906 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:10:12 crc kubenswrapper[4745]: E1209 12:10:12.555983 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:10:25 crc kubenswrapper[4745]: I1209 12:10:25.555369 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:10:25 crc kubenswrapper[4745]: E1209 12:10:25.556708 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:10:38 crc kubenswrapper[4745]: I1209 12:10:38.555116 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:10:38 crc kubenswrapper[4745]: E1209 12:10:38.556239 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:10:51 crc kubenswrapper[4745]: I1209 12:10:51.556362 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:10:51 crc kubenswrapper[4745]: E1209 12:10:51.558040 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:11:06 crc kubenswrapper[4745]: I1209 12:11:06.555025 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:11:06 crc kubenswrapper[4745]: E1209 12:11:06.556132 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:11:17 crc kubenswrapper[4745]: I1209 12:11:17.555052 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:11:17 crc kubenswrapper[4745]: E1209 12:11:17.556546 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:11:29 crc kubenswrapper[4745]: I1209 12:11:29.555560 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:11:29 crc kubenswrapper[4745]: E1209 12:11:29.556590 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:11:43 crc kubenswrapper[4745]: I1209 12:11:43.560350 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:11:43 crc kubenswrapper[4745]: E1209 12:11:43.561215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:11:58 crc kubenswrapper[4745]: I1209 12:11:58.555132 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:11:58 crc kubenswrapper[4745]: E1209 12:11:58.555707 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:12:09 crc kubenswrapper[4745]: I1209 12:12:09.555337 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:12:09 crc kubenswrapper[4745]: E1209 12:12:09.556132 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:12:22 crc kubenswrapper[4745]: I1209 12:12:22.555366 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:12:22 crc kubenswrapper[4745]: E1209 12:12:22.557573 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:12:34 crc kubenswrapper[4745]: I1209 12:12:34.555741 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:12:34 crc kubenswrapper[4745]: E1209 12:12:34.556542 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:12:46 crc kubenswrapper[4745]: I1209 12:12:46.555242 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:12:46 crc kubenswrapper[4745]: E1209 12:12:46.556592 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:12:57 crc kubenswrapper[4745]: I1209 12:12:57.555160 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:12:57 crc kubenswrapper[4745]: E1209 12:12:57.555879 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:13:11 crc kubenswrapper[4745]: I1209 12:13:11.555879 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:13:11 crc kubenswrapper[4745]: E1209 12:13:11.556768 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:13:26 crc kubenswrapper[4745]: I1209 12:13:26.555358 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:13:26 crc kubenswrapper[4745]: E1209 12:13:26.556186 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.315372 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"] Dec 09 12:13:38 crc kubenswrapper[4745]: E1209 12:13:38.316826 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="extract-utilities" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.316867 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="extract-utilities" Dec 09 12:13:38 crc kubenswrapper[4745]: E1209 12:13:38.316889 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="registry-server" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.316901 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="registry-server" Dec 09 12:13:38 crc kubenswrapper[4745]: E1209 12:13:38.316921 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="extract-content" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.316932 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="extract-content" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.317264 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8f7849-ff64-4fc6-9df8-8d13dc911b54" containerName="registry-server" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.320989 4745 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.334377 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"] Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.458361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.458417 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jlc\" (UniqueName: \"kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.458549 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.560041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.560142 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.560165 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jlc\" (UniqueName: \"kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.560549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.560606 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.580567 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jlc\" (UniqueName: \"kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc\") pod \"redhat-marketplace-bjlzl\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:38 crc kubenswrapper[4745]: I1209 12:13:38.641832 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:39 crc kubenswrapper[4745]: I1209 12:13:39.146436 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"] Dec 09 12:13:39 crc kubenswrapper[4745]: I1209 12:13:39.157105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerStarted","Data":"c3bf432e273fa99270669d60057da9468a0955857a25d2dd37bda51f8ae74678"} Dec 09 12:13:40 crc kubenswrapper[4745]: I1209 12:13:40.168886 4745 generic.go:334] "Generic (PLEG): container finished" podID="580746f0-c403-4c6c-bb79-1596568a890e" containerID="0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a" exitCode=0 Dec 09 12:13:40 crc kubenswrapper[4745]: I1209 12:13:40.168935 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerDied","Data":"0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a"} Dec 09 12:13:40 crc kubenswrapper[4745]: I1209 12:13:40.171718 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:13:41 crc kubenswrapper[4745]: I1209 12:13:41.556060 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:13:41 crc kubenswrapper[4745]: E1209 12:13:41.557028 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 
12:13:43 crc kubenswrapper[4745]: I1209 12:13:43.194360 4745 generic.go:334] "Generic (PLEG): container finished" podID="580746f0-c403-4c6c-bb79-1596568a890e" containerID="653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97" exitCode=0 Dec 09 12:13:43 crc kubenswrapper[4745]: I1209 12:13:43.194418 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerDied","Data":"653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97"} Dec 09 12:13:44 crc kubenswrapper[4745]: I1209 12:13:44.205189 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerStarted","Data":"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"} Dec 09 12:13:44 crc kubenswrapper[4745]: I1209 12:13:44.227363 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjlzl" podStartSLOduration=2.723883367 podStartE2EDuration="6.227337779s" podCreationTimestamp="2025-12-09 12:13:38 +0000 UTC" firstStartedPulling="2025-12-09 12:13:40.171420839 +0000 UTC m=+2506.996622363" lastFinishedPulling="2025-12-09 12:13:43.674875251 +0000 UTC m=+2510.500076775" observedRunningTime="2025-12-09 12:13:44.226755453 +0000 UTC m=+2511.051956977" watchObservedRunningTime="2025-12-09 12:13:44.227337779 +0000 UTC m=+2511.052539303" Dec 09 12:13:48 crc kubenswrapper[4745]: I1209 12:13:48.642023 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:48 crc kubenswrapper[4745]: I1209 12:13:48.643692 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:48 crc kubenswrapper[4745]: I1209 12:13:48.710415 4745 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:49 crc kubenswrapper[4745]: I1209 12:13:49.286784 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:49 crc kubenswrapper[4745]: I1209 12:13:49.338838 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"] Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.260485 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjlzl" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="registry-server" containerID="cri-o://a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06" gracePeriod=2 Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.648330 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlzl" Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.784062 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jlc\" (UniqueName: \"kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc\") pod \"580746f0-c403-4c6c-bb79-1596568a890e\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.784211 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content\") pod \"580746f0-c403-4c6c-bb79-1596568a890e\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") " Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.784318 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities\") 
pod \"580746f0-c403-4c6c-bb79-1596568a890e\" (UID: \"580746f0-c403-4c6c-bb79-1596568a890e\") "
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.785480 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities" (OuterVolumeSpecName: "utilities") pod "580746f0-c403-4c6c-bb79-1596568a890e" (UID: "580746f0-c403-4c6c-bb79-1596568a890e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.793838 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc" (OuterVolumeSpecName: "kube-api-access-w5jlc") pod "580746f0-c403-4c6c-bb79-1596568a890e" (UID: "580746f0-c403-4c6c-bb79-1596568a890e"). InnerVolumeSpecName "kube-api-access-w5jlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.808598 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "580746f0-c403-4c6c-bb79-1596568a890e" (UID: "580746f0-c403-4c6c-bb79-1596568a890e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.886165 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.886222 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580746f0-c403-4c6c-bb79-1596568a890e-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:13:51 crc kubenswrapper[4745]: I1209 12:13:51.886236 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jlc\" (UniqueName: \"kubernetes.io/projected/580746f0-c403-4c6c-bb79-1596568a890e-kube-api-access-w5jlc\") on node \"crc\" DevicePath \"\""
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.267802 4745 generic.go:334] "Generic (PLEG): container finished" podID="580746f0-c403-4c6c-bb79-1596568a890e" containerID="a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06" exitCode=0
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.267864 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerDied","Data":"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"}
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.267910 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlzl" event={"ID":"580746f0-c403-4c6c-bb79-1596568a890e","Type":"ContainerDied","Data":"c3bf432e273fa99270669d60057da9468a0955857a25d2dd37bda51f8ae74678"}
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.267913 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlzl"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.267936 4745 scope.go:117] "RemoveContainer" containerID="a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.288056 4745 scope.go:117] "RemoveContainer" containerID="653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.303880 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"]
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.310936 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlzl"]
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.326964 4745 scope.go:117] "RemoveContainer" containerID="0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.355441 4745 scope.go:117] "RemoveContainer" containerID="a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"
Dec 09 12:13:52 crc kubenswrapper[4745]: E1209 12:13:52.356223 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06\": container with ID starting with a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06 not found: ID does not exist" containerID="a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.356265 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06"} err="failed to get container status \"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06\": rpc error: code = NotFound desc = could not find container \"a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06\": container with ID starting with a45d47f349136a3e5fba94f4e3df478afcf017cee8cf519ab7209f20d13dec06 not found: ID does not exist"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.356294 4745 scope.go:117] "RemoveContainer" containerID="653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97"
Dec 09 12:13:52 crc kubenswrapper[4745]: E1209 12:13:52.356668 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97\": container with ID starting with 653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97 not found: ID does not exist" containerID="653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.356696 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97"} err="failed to get container status \"653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97\": rpc error: code = NotFound desc = could not find container \"653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97\": container with ID starting with 653db755abd4360992487386b9758e039f65240a2c6702ecd72021a0bf2c1b97 not found: ID does not exist"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.356717 4745 scope.go:117] "RemoveContainer" containerID="0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a"
Dec 09 12:13:52 crc kubenswrapper[4745]: E1209 12:13:52.356961 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a\": container with ID starting with 0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a not found: ID does not exist" containerID="0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a"
Dec 09 12:13:52 crc kubenswrapper[4745]: I1209 12:13:52.357022 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a"} err="failed to get container status \"0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a\": rpc error: code = NotFound desc = could not find container \"0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a\": container with ID starting with 0d8ce6069abdd76d609403554223b1251ae3589bc9c0a88d11a3bebd29aad40a not found: ID does not exist"
Dec 09 12:13:53 crc kubenswrapper[4745]: I1209 12:13:53.566229 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580746f0-c403-4c6c-bb79-1596568a890e" path="/var/lib/kubelet/pods/580746f0-c403-4c6c-bb79-1596568a890e/volumes"
Dec 09 12:13:54 crc kubenswrapper[4745]: I1209 12:13:54.555343 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc"
Dec 09 12:13:54 crc kubenswrapper[4745]: E1209 12:13:54.555887 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:14:08 crc kubenswrapper[4745]: I1209 12:14:08.555403 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc"
Dec 09 12:14:09 crc kubenswrapper[4745]: I1209 12:14:09.404084 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1"}
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.159980 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"]
Dec 09 12:15:00 crc kubenswrapper[4745]: E1209 12:15:00.161031 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="registry-server"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.161051 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="registry-server"
Dec 09 12:15:00 crc kubenswrapper[4745]: E1209 12:15:00.161067 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="extract-content"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.161075 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="extract-content"
Dec 09 12:15:00 crc kubenswrapper[4745]: E1209 12:15:00.161090 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="extract-utilities"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.161099 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="extract-utilities"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.161293 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="580746f0-c403-4c6c-bb79-1596568a890e" containerName="registry-server"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.162022 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.166108 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.166126 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.168729 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"]
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.188742 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gccvk\" (UniqueName: \"kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.188807 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.188864 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.291164 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.291335 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gccvk\" (UniqueName: \"kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.291382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.292854 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.300953 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.311504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gccvk\" (UniqueName: \"kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk\") pod \"collect-profiles-29421375-w6jdw\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.483727 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.912042 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"]
Dec 09 12:15:00 crc kubenswrapper[4745]: I1209 12:15:00.942826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw" event={"ID":"a423627e-0179-440f-a680-eaee72bdc514","Type":"ContainerStarted","Data":"bd2248ac62acd8ed7fdb0af0dc3dd57e01ec89e5137917929522c47c8f0b671e"}
Dec 09 12:15:01 crc kubenswrapper[4745]: I1209 12:15:01.955544 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw" event={"ID":"a423627e-0179-440f-a680-eaee72bdc514","Type":"ContainerStarted","Data":"9b79014c7124d9db17e37f09f0b1757a7df280c70480472354afb21289e7606f"}
Dec 09 12:15:02 crc kubenswrapper[4745]: I1209 12:15:02.965869 4745 generic.go:334] "Generic (PLEG): container finished" podID="a423627e-0179-440f-a680-eaee72bdc514" containerID="9b79014c7124d9db17e37f09f0b1757a7df280c70480472354afb21289e7606f" exitCode=0
Dec 09 12:15:02 crc kubenswrapper[4745]: I1209 12:15:02.965955 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw" event={"ID":"a423627e-0179-440f-a680-eaee72bdc514","Type":"ContainerDied","Data":"9b79014c7124d9db17e37f09f0b1757a7df280c70480472354afb21289e7606f"}
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.263776 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.446600 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume\") pod \"a423627e-0179-440f-a680-eaee72bdc514\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") "
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.446665 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume\") pod \"a423627e-0179-440f-a680-eaee72bdc514\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") "
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.446919 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gccvk\" (UniqueName: \"kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk\") pod \"a423627e-0179-440f-a680-eaee72bdc514\" (UID: \"a423627e-0179-440f-a680-eaee72bdc514\") "
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.447648 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume" (OuterVolumeSpecName: "config-volume") pod "a423627e-0179-440f-a680-eaee72bdc514" (UID: "a423627e-0179-440f-a680-eaee72bdc514"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.454658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk" (OuterVolumeSpecName: "kube-api-access-gccvk") pod "a423627e-0179-440f-a680-eaee72bdc514" (UID: "a423627e-0179-440f-a680-eaee72bdc514"). InnerVolumeSpecName "kube-api-access-gccvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.456053 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a423627e-0179-440f-a680-eaee72bdc514" (UID: "a423627e-0179-440f-a680-eaee72bdc514"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.548569 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gccvk\" (UniqueName: \"kubernetes.io/projected/a423627e-0179-440f-a680-eaee72bdc514-kube-api-access-gccvk\") on node \"crc\" DevicePath \"\""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.548886 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a423627e-0179-440f-a680-eaee72bdc514-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.548949 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a423627e-0179-440f-a680-eaee72bdc514-config-volume\") on node \"crc\" DevicePath \"\""
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.985181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw" event={"ID":"a423627e-0179-440f-a680-eaee72bdc514","Type":"ContainerDied","Data":"bd2248ac62acd8ed7fdb0af0dc3dd57e01ec89e5137917929522c47c8f0b671e"}
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.985723 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2248ac62acd8ed7fdb0af0dc3dd57e01ec89e5137917929522c47c8f0b671e"
Dec 09 12:15:03 crc kubenswrapper[4745]: I1209 12:15:03.985306 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"
Dec 09 12:15:04 crc kubenswrapper[4745]: I1209 12:15:04.347366 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb"]
Dec 09 12:15:04 crc kubenswrapper[4745]: I1209 12:15:04.355160 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421330-9k5cb"]
Dec 09 12:15:05 crc kubenswrapper[4745]: I1209 12:15:05.568107 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd48ed5-6e8d-40f5-b3d1-1254df80a033" path="/var/lib/kubelet/pods/ecd48ed5-6e8d-40f5-b3d1-1254df80a033/volumes"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.763911 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2dz9r"]
Dec 09 12:15:54 crc kubenswrapper[4745]: E1209 12:15:54.765114 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a423627e-0179-440f-a680-eaee72bdc514" containerName="collect-profiles"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.765135 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a423627e-0179-440f-a680-eaee72bdc514" containerName="collect-profiles"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.765347 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a423627e-0179-440f-a680-eaee72bdc514" containerName="collect-profiles"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.766805 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.784342 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dz9r"]
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.878627 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p64r\" (UniqueName: \"kubernetes.io/projected/2f125ec9-331b-4128-85ce-9e03a7c28543-kube-api-access-6p64r\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.878683 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-utilities\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.878898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-catalog-content\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.979497 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-catalog-content\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.979588 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p64r\" (UniqueName: \"kubernetes.io/projected/2f125ec9-331b-4128-85ce-9e03a7c28543-kube-api-access-6p64r\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.979609 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-utilities\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.980072 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-catalog-content\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:54 crc kubenswrapper[4745]: I1209 12:15:54.980177 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f125ec9-331b-4128-85ce-9e03a7c28543-utilities\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:55 crc kubenswrapper[4745]: I1209 12:15:55.019411 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p64r\" (UniqueName: \"kubernetes.io/projected/2f125ec9-331b-4128-85ce-9e03a7c28543-kube-api-access-6p64r\") pod \"redhat-operators-2dz9r\" (UID: \"2f125ec9-331b-4128-85ce-9e03a7c28543\") " pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:55 crc kubenswrapper[4745]: I1209 12:15:55.083866 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:15:55 crc kubenswrapper[4745]: I1209 12:15:55.541041 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dz9r"]
Dec 09 12:15:56 crc kubenswrapper[4745]: I1209 12:15:56.401661 4745 generic.go:334] "Generic (PLEG): container finished" podID="2f125ec9-331b-4128-85ce-9e03a7c28543" containerID="780900cbe500656a71651fd9740d005499b8fd4a3feb469292915bb9c749da85" exitCode=0
Dec 09 12:15:56 crc kubenswrapper[4745]: I1209 12:15:56.401735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dz9r" event={"ID":"2f125ec9-331b-4128-85ce-9e03a7c28543","Type":"ContainerDied","Data":"780900cbe500656a71651fd9740d005499b8fd4a3feb469292915bb9c749da85"}
Dec 09 12:15:56 crc kubenswrapper[4745]: I1209 12:15:56.401970 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dz9r" event={"ID":"2f125ec9-331b-4128-85ce-9e03a7c28543","Type":"ContainerStarted","Data":"71ed59371047d84fb64518dcb5df71eaaf0e0a1512cb7a9a9c1ff7c16200ca21"}
Dec 09 12:16:03 crc kubenswrapper[4745]: I1209 12:16:03.454900 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dz9r" event={"ID":"2f125ec9-331b-4128-85ce-9e03a7c28543","Type":"ContainerStarted","Data":"03be8d921acd8badea29b697c8fc158288ff21504e994c617bdc10c605e0098a"}
Dec 09 12:16:04 crc kubenswrapper[4745]: I1209 12:16:04.464283 4745 generic.go:334] "Generic (PLEG): container finished" podID="2f125ec9-331b-4128-85ce-9e03a7c28543" containerID="03be8d921acd8badea29b697c8fc158288ff21504e994c617bdc10c605e0098a" exitCode=0
Dec 09 12:16:04 crc kubenswrapper[4745]: I1209 12:16:04.464378 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dz9r" event={"ID":"2f125ec9-331b-4128-85ce-9e03a7c28543","Type":"ContainerDied","Data":"03be8d921acd8badea29b697c8fc158288ff21504e994c617bdc10c605e0098a"}
Dec 09 12:16:05 crc kubenswrapper[4745]: I1209 12:16:05.374218 4745 scope.go:117] "RemoveContainer" containerID="225ee8b0639543d5dff5ae33d4a25cd69f17c6e2179ae6e65a1ba3dc207d8c9a"
Dec 09 12:16:05 crc kubenswrapper[4745]: I1209 12:16:05.476886 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dz9r" event={"ID":"2f125ec9-331b-4128-85ce-9e03a7c28543","Type":"ContainerStarted","Data":"c22c8c6b305b8151e9ab623dcc9b0a3e9e8cf468097e07b1be06ea2bfa610741"}
Dec 09 12:16:05 crc kubenswrapper[4745]: I1209 12:16:05.501966 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2dz9r" podStartSLOduration=3.08316583 podStartE2EDuration="11.501930386s" podCreationTimestamp="2025-12-09 12:15:54 +0000 UTC" firstStartedPulling="2025-12-09 12:15:56.403628024 +0000 UTC m=+2643.228829538" lastFinishedPulling="2025-12-09 12:16:04.82239257 +0000 UTC m=+2651.647594094" observedRunningTime="2025-12-09 12:16:05.495953014 +0000 UTC m=+2652.321154538" watchObservedRunningTime="2025-12-09 12:16:05.501930386 +0000 UTC m=+2652.327131910"
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.084941 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.085452 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.127863 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.623562 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2dz9r"
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.721581 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dz9r"]
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.756331 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 12:16:15 crc kubenswrapper[4745]: I1209 12:16:15.756636 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7rxgj" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="registry-server" containerID="cri-o://ed6da805d631cc91b5c088e71b1271d00273d41e60075197e15f9e8a2a022df5" gracePeriod=2
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.573818 4745 generic.go:334] "Generic (PLEG): container finished" podID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerID="ed6da805d631cc91b5c088e71b1271d00273d41e60075197e15f9e8a2a022df5" exitCode=0
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.573956 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerDied","Data":"ed6da805d631cc91b5c088e71b1271d00273d41e60075197e15f9e8a2a022df5"}
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.669974 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.710500 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content\") pod \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") "
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.710732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities\") pod \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") "
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.710778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5kp8\" (UniqueName: \"kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8\") pod \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\" (UID: \"bb9b0d93-98de-42bc-99c2-18b0072da5b3\") "
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.711269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities" (OuterVolumeSpecName: "utilities") pod "bb9b0d93-98de-42bc-99c2-18b0072da5b3" (UID: "bb9b0d93-98de-42bc-99c2-18b0072da5b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.747266 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8" (OuterVolumeSpecName: "kube-api-access-f5kp8") pod "bb9b0d93-98de-42bc-99c2-18b0072da5b3" (UID: "bb9b0d93-98de-42bc-99c2-18b0072da5b3"). InnerVolumeSpecName "kube-api-access-f5kp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.811920 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.811968 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5kp8\" (UniqueName: \"kubernetes.io/projected/bb9b0d93-98de-42bc-99c2-18b0072da5b3-kube-api-access-f5kp8\") on node \"crc\" DevicePath \"\""
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.817439 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb9b0d93-98de-42bc-99c2-18b0072da5b3" (UID: "bb9b0d93-98de-42bc-99c2-18b0072da5b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:16:16 crc kubenswrapper[4745]: I1209 12:16:16.912870 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9b0d93-98de-42bc-99c2-18b0072da5b3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.583138 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rxgj" event={"ID":"bb9b0d93-98de-42bc-99c2-18b0072da5b3","Type":"ContainerDied","Data":"5e909146a75e93721e9acf51669c5c283e82a75d1b69ee30af02f22cc94c3da6"}
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.583198 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rxgj"
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.583491 4745 scope.go:117] "RemoveContainer" containerID="ed6da805d631cc91b5c088e71b1271d00273d41e60075197e15f9e8a2a022df5"
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.605776 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.611954 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7rxgj"]
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.612653 4745 scope.go:117] "RemoveContainer" containerID="59dcb27350b95af5c2cb1b56c9aa07adb0e37da3d7f08e6735e90b7fb7c6c7e1"
Dec 09 12:16:17 crc kubenswrapper[4745]: I1209 12:16:17.648618 4745 scope.go:117] "RemoveContainer" containerID="6fd5a80a3e58595a6d88e732ff921886f2ed061ce05c3a96a63d3a4ebfb9ab93"
Dec 09 12:16:19 crc kubenswrapper[4745]: I1209 12:16:19.564113 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" path="/var/lib/kubelet/pods/bb9b0d93-98de-42bc-99c2-18b0072da5b3/volumes"
Dec 09 12:16:25 crc kubenswrapper[4745]: I1209 12:16:25.491069 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 12:16:25 crc kubenswrapper[4745]: I1209 12:16:25.491836 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:16:55 crc kubenswrapper[4745]: I1209 12:16:55.475384 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 12:16:55 crc kubenswrapper[4745]: I1209 12:16:55.477166 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:17:25 crc kubenswrapper[4745]: I1209 12:17:25.478207 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 12:17:25 crc kubenswrapper[4745]: I1209 12:17:25.479645 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 12:17:25 crc kubenswrapper[4745]: I1209 12:17:25.479726 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
Dec 09 12:17:25 crc kubenswrapper[4745]: I1209 12:17:25.480460 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx"
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:17:25 crc kubenswrapper[4745]: I1209 12:17:25.480601 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1" gracePeriod=600 Dec 09 12:17:26 crc kubenswrapper[4745]: I1209 12:17:26.213103 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1" exitCode=0 Dec 09 12:17:26 crc kubenswrapper[4745]: I1209 12:17:26.213168 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1"} Dec 09 12:17:26 crc kubenswrapper[4745]: I1209 12:17:26.213429 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1"} Dec 09 12:17:26 crc kubenswrapper[4745]: I1209 12:17:26.213469 4745 scope.go:117] "RemoveContainer" containerID="a0a1dfc63909d3e9404b82fba2f329a6b48fe78bf9d0b421d2121eba1bb6ecfc" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.346701 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:17:56 crc kubenswrapper[4745]: E1209 12:17:56.347501 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="extract-content" Dec 09 12:17:56 crc 
kubenswrapper[4745]: I1209 12:17:56.347539 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="extract-content" Dec 09 12:17:56 crc kubenswrapper[4745]: E1209 12:17:56.347561 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="extract-utilities" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.347566 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="extract-utilities" Dec 09 12:17:56 crc kubenswrapper[4745]: E1209 12:17:56.347581 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="registry-server" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.347587 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="registry-server" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.347764 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9b0d93-98de-42bc-99c2-18b0072da5b3" containerName="registry-server" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.348894 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.361280 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.401647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.402083 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.402170 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zstr6\" (UniqueName: \"kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.504253 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.504372 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.504405 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zstr6\" (UniqueName: \"kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.505008 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.505068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.526276 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zstr6\" (UniqueName: \"kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6\") pod \"certified-operators-k9jdh\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.668644 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:17:56 crc kubenswrapper[4745]: I1209 12:17:56.961958 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:17:57 crc kubenswrapper[4745]: I1209 12:17:57.446148 4745 generic.go:334] "Generic (PLEG): container finished" podID="385944da-08ae-4cf0-a483-5b52ae162af9" containerID="5b4d04e64e6dc1d1fd3dffbf246ff9b6b907e663d90f4ed2cc4da61739433b0a" exitCode=0 Dec 09 12:17:57 crc kubenswrapper[4745]: I1209 12:17:57.446273 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerDied","Data":"5b4d04e64e6dc1d1fd3dffbf246ff9b6b907e663d90f4ed2cc4da61739433b0a"} Dec 09 12:17:57 crc kubenswrapper[4745]: I1209 12:17:57.446486 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerStarted","Data":"377c9075ba5be65b5c68e5785f38264b1af7b907566a4bc1988df4192d4751f0"} Dec 09 12:17:58 crc kubenswrapper[4745]: I1209 12:17:58.455204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerStarted","Data":"c7d7c626f1d9818f759425bf8af6b49e2d55db883593f7bde98670f4b2b46836"} Dec 09 12:17:59 crc kubenswrapper[4745]: I1209 12:17:59.462962 4745 generic.go:334] "Generic (PLEG): container finished" podID="385944da-08ae-4cf0-a483-5b52ae162af9" containerID="c7d7c626f1d9818f759425bf8af6b49e2d55db883593f7bde98670f4b2b46836" exitCode=0 Dec 09 12:17:59 crc kubenswrapper[4745]: I1209 12:17:59.463014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" 
event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerDied","Data":"c7d7c626f1d9818f759425bf8af6b49e2d55db883593f7bde98670f4b2b46836"} Dec 09 12:18:00 crc kubenswrapper[4745]: I1209 12:18:00.473265 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerStarted","Data":"173ee697c919ed23b718a5eaba80bf90376582091bcd292c2a9e8bde64dfbeb3"} Dec 09 12:18:00 crc kubenswrapper[4745]: I1209 12:18:00.492934 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9jdh" podStartSLOduration=2.080734734 podStartE2EDuration="4.492901472s" podCreationTimestamp="2025-12-09 12:17:56 +0000 UTC" firstStartedPulling="2025-12-09 12:17:57.449774337 +0000 UTC m=+2764.274975861" lastFinishedPulling="2025-12-09 12:17:59.861941075 +0000 UTC m=+2766.687142599" observedRunningTime="2025-12-09 12:18:00.489279195 +0000 UTC m=+2767.314480729" watchObservedRunningTime="2025-12-09 12:18:00.492901472 +0000 UTC m=+2767.318102996" Dec 09 12:18:06 crc kubenswrapper[4745]: I1209 12:18:06.669864 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:06 crc kubenswrapper[4745]: I1209 12:18:06.670443 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:06 crc kubenswrapper[4745]: I1209 12:18:06.713380 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:07 crc kubenswrapper[4745]: I1209 12:18:07.574486 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:07 crc kubenswrapper[4745]: I1209 12:18:07.627272 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:18:09 crc kubenswrapper[4745]: I1209 12:18:09.534394 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k9jdh" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="registry-server" containerID="cri-o://173ee697c919ed23b718a5eaba80bf90376582091bcd292c2a9e8bde64dfbeb3" gracePeriod=2 Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.570016 4745 generic.go:334] "Generic (PLEG): container finished" podID="385944da-08ae-4cf0-a483-5b52ae162af9" containerID="173ee697c919ed23b718a5eaba80bf90376582091bcd292c2a9e8bde64dfbeb3" exitCode=0 Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.572759 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerDied","Data":"173ee697c919ed23b718a5eaba80bf90376582091bcd292c2a9e8bde64dfbeb3"} Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.786864 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.821993 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zstr6\" (UniqueName: \"kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6\") pod \"385944da-08ae-4cf0-a483-5b52ae162af9\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.822045 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities\") pod \"385944da-08ae-4cf0-a483-5b52ae162af9\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.822095 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content\") pod \"385944da-08ae-4cf0-a483-5b52ae162af9\" (UID: \"385944da-08ae-4cf0-a483-5b52ae162af9\") " Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.823645 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities" (OuterVolumeSpecName: "utilities") pod "385944da-08ae-4cf0-a483-5b52ae162af9" (UID: "385944da-08ae-4cf0-a483-5b52ae162af9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.829043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6" (OuterVolumeSpecName: "kube-api-access-zstr6") pod "385944da-08ae-4cf0-a483-5b52ae162af9" (UID: "385944da-08ae-4cf0-a483-5b52ae162af9"). InnerVolumeSpecName "kube-api-access-zstr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.874328 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "385944da-08ae-4cf0-a483-5b52ae162af9" (UID: "385944da-08ae-4cf0-a483-5b52ae162af9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.923770 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zstr6\" (UniqueName: \"kubernetes.io/projected/385944da-08ae-4cf0-a483-5b52ae162af9-kube-api-access-zstr6\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.923817 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:11 crc kubenswrapper[4745]: I1209 12:18:11.923827 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385944da-08ae-4cf0-a483-5b52ae162af9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.578285 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9jdh" event={"ID":"385944da-08ae-4cf0-a483-5b52ae162af9","Type":"ContainerDied","Data":"377c9075ba5be65b5c68e5785f38264b1af7b907566a4bc1988df4192d4751f0"} Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.578351 4745 scope.go:117] "RemoveContainer" containerID="173ee697c919ed23b718a5eaba80bf90376582091bcd292c2a9e8bde64dfbeb3" Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.578352 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9jdh" Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.603011 4745 scope.go:117] "RemoveContainer" containerID="c7d7c626f1d9818f759425bf8af6b49e2d55db883593f7bde98670f4b2b46836" Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.612689 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.617887 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k9jdh"] Dec 09 12:18:12 crc kubenswrapper[4745]: I1209 12:18:12.646486 4745 scope.go:117] "RemoveContainer" containerID="5b4d04e64e6dc1d1fd3dffbf246ff9b6b907e663d90f4ed2cc4da61739433b0a" Dec 09 12:18:13 crc kubenswrapper[4745]: I1209 12:18:13.566169 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" path="/var/lib/kubelet/pods/385944da-08ae-4cf0-a483-5b52ae162af9/volumes" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.479699 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:27 crc kubenswrapper[4745]: E1209 12:18:27.480544 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="extract-content" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.480559 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="extract-content" Dec 09 12:18:27 crc kubenswrapper[4745]: E1209 12:18:27.480574 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="registry-server" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.480582 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="registry-server" 
Dec 09 12:18:27 crc kubenswrapper[4745]: E1209 12:18:27.480601 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="extract-utilities" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.480610 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="extract-utilities" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.480819 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="385944da-08ae-4cf0-a483-5b52ae162af9" containerName="registry-server" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.482059 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.497706 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.631194 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.631238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrjp8\" (UniqueName: \"kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.631268 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.732873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.733230 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrjp8\" (UniqueName: \"kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.733379 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.733387 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.734037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.753280 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrjp8\" (UniqueName: \"kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8\") pod \"community-operators-4qj4h\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:27 crc kubenswrapper[4745]: I1209 12:18:27.802422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:28 crc kubenswrapper[4745]: I1209 12:18:28.301690 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:28 crc kubenswrapper[4745]: W1209 12:18:28.310423 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b46aeb_dc53_4c43_83ec_e71c68f272a6.slice/crio-f18f3015868bea673b4bd76723621441bcf54cdfb9f20c494cee75b0cbb91698 WatchSource:0}: Error finding container f18f3015868bea673b4bd76723621441bcf54cdfb9f20c494cee75b0cbb91698: Status 404 returned error can't find the container with id f18f3015868bea673b4bd76723621441bcf54cdfb9f20c494cee75b0cbb91698 Dec 09 12:18:28 crc kubenswrapper[4745]: I1209 12:18:28.692882 4745 generic.go:334] "Generic (PLEG): container finished" podID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerID="e8a7bc7467bb539102fe74ba55d534b872b0aafac46fd3df1eb53125da890d92" exitCode=0 Dec 09 12:18:28 crc kubenswrapper[4745]: I1209 12:18:28.692930 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" 
event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerDied","Data":"e8a7bc7467bb539102fe74ba55d534b872b0aafac46fd3df1eb53125da890d92"} Dec 09 12:18:28 crc kubenswrapper[4745]: I1209 12:18:28.692962 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerStarted","Data":"f18f3015868bea673b4bd76723621441bcf54cdfb9f20c494cee75b0cbb91698"} Dec 09 12:18:30 crc kubenswrapper[4745]: I1209 12:18:30.705857 4745 generic.go:334] "Generic (PLEG): container finished" podID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerID="c827f0e9a1ce140a62d6458c77e5c813d3d8a5f48ece33bcc416366dc500a8bc" exitCode=0 Dec 09 12:18:30 crc kubenswrapper[4745]: I1209 12:18:30.705959 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerDied","Data":"c827f0e9a1ce140a62d6458c77e5c813d3d8a5f48ece33bcc416366dc500a8bc"} Dec 09 12:18:31 crc kubenswrapper[4745]: I1209 12:18:31.714255 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerStarted","Data":"9b549ffefaf403fd99b1afac3a7d0ac2c1550ad9ebc997b49ab363cdde7017ec"} Dec 09 12:18:31 crc kubenswrapper[4745]: I1209 12:18:31.736944 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4qj4h" podStartSLOduration=2.179697021 podStartE2EDuration="4.736919171s" podCreationTimestamp="2025-12-09 12:18:27 +0000 UTC" firstStartedPulling="2025-12-09 12:18:28.69432842 +0000 UTC m=+2795.519529954" lastFinishedPulling="2025-12-09 12:18:31.25155057 +0000 UTC m=+2798.076752104" observedRunningTime="2025-12-09 12:18:31.730645432 +0000 UTC m=+2798.555846966" watchObservedRunningTime="2025-12-09 12:18:31.736919171 +0000 UTC 
m=+2798.562120695" Dec 09 12:18:37 crc kubenswrapper[4745]: I1209 12:18:37.802789 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:37 crc kubenswrapper[4745]: I1209 12:18:37.804588 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:37 crc kubenswrapper[4745]: I1209 12:18:37.854816 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:38 crc kubenswrapper[4745]: I1209 12:18:38.804213 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:38 crc kubenswrapper[4745]: I1209 12:18:38.847207 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:41 crc kubenswrapper[4745]: I1209 12:18:41.413713 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4qj4h" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="registry-server" containerID="cri-o://9b549ffefaf403fd99b1afac3a7d0ac2c1550ad9ebc997b49ab363cdde7017ec" gracePeriod=2 Dec 09 12:18:42 crc kubenswrapper[4745]: I1209 12:18:42.422030 4745 generic.go:334] "Generic (PLEG): container finished" podID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerID="9b549ffefaf403fd99b1afac3a7d0ac2c1550ad9ebc997b49ab363cdde7017ec" exitCode=0 Dec 09 12:18:42 crc kubenswrapper[4745]: I1209 12:18:42.422083 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerDied","Data":"9b549ffefaf403fd99b1afac3a7d0ac2c1550ad9ebc997b49ab363cdde7017ec"} Dec 09 12:18:42 crc kubenswrapper[4745]: I1209 12:18:42.923722 4745 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.117089 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrjp8\" (UniqueName: \"kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8\") pod \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.117281 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities\") pod \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.117301 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content\") pod \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\" (UID: \"65b46aeb-dc53-4c43-83ec-e71c68f272a6\") " Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.118192 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities" (OuterVolumeSpecName: "utilities") pod "65b46aeb-dc53-4c43-83ec-e71c68f272a6" (UID: "65b46aeb-dc53-4c43-83ec-e71c68f272a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.122801 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8" (OuterVolumeSpecName: "kube-api-access-hrjp8") pod "65b46aeb-dc53-4c43-83ec-e71c68f272a6" (UID: "65b46aeb-dc53-4c43-83ec-e71c68f272a6"). 
InnerVolumeSpecName "kube-api-access-hrjp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.166903 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65b46aeb-dc53-4c43-83ec-e71c68f272a6" (UID: "65b46aeb-dc53-4c43-83ec-e71c68f272a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.219047 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.219086 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b46aeb-dc53-4c43-83ec-e71c68f272a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.219101 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrjp8\" (UniqueName: \"kubernetes.io/projected/65b46aeb-dc53-4c43-83ec-e71c68f272a6-kube-api-access-hrjp8\") on node \"crc\" DevicePath \"\"" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.430774 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qj4h" event={"ID":"65b46aeb-dc53-4c43-83ec-e71c68f272a6","Type":"ContainerDied","Data":"f18f3015868bea673b4bd76723621441bcf54cdfb9f20c494cee75b0cbb91698"} Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.430804 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4qj4h" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.430828 4745 scope.go:117] "RemoveContainer" containerID="9b549ffefaf403fd99b1afac3a7d0ac2c1550ad9ebc997b49ab363cdde7017ec" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.451087 4745 scope.go:117] "RemoveContainer" containerID="c827f0e9a1ce140a62d6458c77e5c813d3d8a5f48ece33bcc416366dc500a8bc" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.466658 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.470559 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4qj4h"] Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.483449 4745 scope.go:117] "RemoveContainer" containerID="e8a7bc7467bb539102fe74ba55d534b872b0aafac46fd3df1eb53125da890d92" Dec 09 12:18:43 crc kubenswrapper[4745]: I1209 12:18:43.564180 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" path="/var/lib/kubelet/pods/65b46aeb-dc53-4c43-83ec-e71c68f272a6/volumes" Dec 09 12:19:25 crc kubenswrapper[4745]: I1209 12:19:25.475001 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:19:25 crc kubenswrapper[4745]: I1209 12:19:25.476723 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:19:55 crc kubenswrapper[4745]: 
I1209 12:19:55.475992 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:19:55 crc kubenswrapper[4745]: I1209 12:19:55.477222 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:20:25 crc kubenswrapper[4745]: I1209 12:20:25.475784 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:20:25 crc kubenswrapper[4745]: I1209 12:20:25.476409 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:20:25 crc kubenswrapper[4745]: I1209 12:20:25.476472 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:20:25 crc kubenswrapper[4745]: I1209 12:20:25.477049 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1"} 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:20:25 crc kubenswrapper[4745]: I1209 12:20:25.477171 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" gracePeriod=600 Dec 09 12:20:25 crc kubenswrapper[4745]: E1209 12:20:25.602063 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:20:26 crc kubenswrapper[4745]: I1209 12:20:26.509290 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" exitCode=0 Dec 09 12:20:26 crc kubenswrapper[4745]: I1209 12:20:26.509386 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1"} Dec 09 12:20:26 crc kubenswrapper[4745]: I1209 12:20:26.509694 4745 scope.go:117] "RemoveContainer" containerID="dbd850053b37cf6f0557953af263789c8beb1c8d932faa93d078bdd932ebe4f1" Dec 09 12:20:26 crc kubenswrapper[4745]: I1209 12:20:26.510332 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 
09 12:20:26 crc kubenswrapper[4745]: E1209 12:20:26.510569 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:20:40 crc kubenswrapper[4745]: I1209 12:20:40.555669 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:20:40 crc kubenswrapper[4745]: E1209 12:20:40.556657 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:20:51 crc kubenswrapper[4745]: I1209 12:20:51.555476 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:20:51 crc kubenswrapper[4745]: E1209 12:20:51.556374 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:21:06 crc kubenswrapper[4745]: I1209 12:21:06.555016 4745 scope.go:117] "RemoveContainer" 
containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:21:06 crc kubenswrapper[4745]: E1209 12:21:06.555506 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:21:18 crc kubenswrapper[4745]: I1209 12:21:18.555592 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:21:18 crc kubenswrapper[4745]: E1209 12:21:18.556345 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:21:32 crc kubenswrapper[4745]: I1209 12:21:32.555631 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:21:32 crc kubenswrapper[4745]: E1209 12:21:32.556299 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:21:43 crc kubenswrapper[4745]: I1209 12:21:43.560239 4745 scope.go:117] 
"RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:21:43 crc kubenswrapper[4745]: E1209 12:21:43.561074 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:21:58 crc kubenswrapper[4745]: I1209 12:21:58.554921 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:21:58 crc kubenswrapper[4745]: E1209 12:21:58.555694 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:22:13 crc kubenswrapper[4745]: I1209 12:22:13.564371 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:22:13 crc kubenswrapper[4745]: E1209 12:22:13.565352 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:22:24 crc kubenswrapper[4745]: I1209 12:22:24.554298 
4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:22:24 crc kubenswrapper[4745]: E1209 12:22:24.556040 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:22:37 crc kubenswrapper[4745]: I1209 12:22:37.555257 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:22:37 crc kubenswrapper[4745]: E1209 12:22:37.555850 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:22:48 crc kubenswrapper[4745]: I1209 12:22:48.555643 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:22:48 crc kubenswrapper[4745]: E1209 12:22:48.556470 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:22:59 crc kubenswrapper[4745]: I1209 
12:22:59.555645 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:22:59 crc kubenswrapper[4745]: E1209 12:22:59.556384 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:23:14 crc kubenswrapper[4745]: I1209 12:23:14.555614 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:23:14 crc kubenswrapper[4745]: E1209 12:23:14.556437 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:23:28 crc kubenswrapper[4745]: I1209 12:23:28.554803 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:23:28 crc kubenswrapper[4745]: E1209 12:23:28.555441 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:23:39 crc 
kubenswrapper[4745]: I1209 12:23:39.561826 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:23:39 crc kubenswrapper[4745]: E1209 12:23:39.563137 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.478376 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:23:48 crc kubenswrapper[4745]: E1209 12:23:48.479291 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="registry-server" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.479309 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="registry-server" Dec 09 12:23:48 crc kubenswrapper[4745]: E1209 12:23:48.479326 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="extract-content" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.479334 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="extract-content" Dec 09 12:23:48 crc kubenswrapper[4745]: E1209 12:23:48.479356 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="extract-utilities" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.479366 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" 
containerName="extract-utilities" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.479709 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b46aeb-dc53-4c43-83ec-e71c68f272a6" containerName="registry-server" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.481164 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.494015 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.620568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.620626 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shnvt\" (UniqueName: \"kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.620802 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.722446 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.722522 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shnvt\" (UniqueName: \"kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.722618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.723106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.723219 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.750665 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shnvt\" (UniqueName: 
\"kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt\") pod \"redhat-marketplace-b4k5t\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:48 crc kubenswrapper[4745]: I1209 12:23:48.844331 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:49 crc kubenswrapper[4745]: I1209 12:23:49.278930 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:23:50 crc kubenswrapper[4745]: I1209 12:23:50.102632 4745 generic.go:334] "Generic (PLEG): container finished" podID="3361ce60-925b-41ff-8582-3156c403eaaf" containerID="fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818" exitCode=0 Dec 09 12:23:50 crc kubenswrapper[4745]: I1209 12:23:50.102766 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerDied","Data":"fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818"} Dec 09 12:23:50 crc kubenswrapper[4745]: I1209 12:23:50.102950 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerStarted","Data":"74e7eeface33a0ca838767ed70cab3caecba0ce695573c95284bb8da05209dcb"} Dec 09 12:23:50 crc kubenswrapper[4745]: I1209 12:23:50.104610 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:23:52 crc kubenswrapper[4745]: I1209 12:23:52.124836 4745 generic.go:334] "Generic (PLEG): container finished" podID="3361ce60-925b-41ff-8582-3156c403eaaf" containerID="15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5" exitCode=0 Dec 09 12:23:52 crc kubenswrapper[4745]: I1209 12:23:52.125373 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerDied","Data":"15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5"} Dec 09 12:23:53 crc kubenswrapper[4745]: I1209 12:23:53.144882 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerStarted","Data":"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590"} Dec 09 12:23:53 crc kubenswrapper[4745]: I1209 12:23:53.169380 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b4k5t" podStartSLOduration=2.721782482 podStartE2EDuration="5.16935792s" podCreationTimestamp="2025-12-09 12:23:48 +0000 UTC" firstStartedPulling="2025-12-09 12:23:50.104347213 +0000 UTC m=+3116.929548737" lastFinishedPulling="2025-12-09 12:23:52.551922651 +0000 UTC m=+3119.377124175" observedRunningTime="2025-12-09 12:23:53.162424923 +0000 UTC m=+3119.987626467" watchObservedRunningTime="2025-12-09 12:23:53.16935792 +0000 UTC m=+3119.994559444" Dec 09 12:23:53 crc kubenswrapper[4745]: I1209 12:23:53.566445 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:23:53 crc kubenswrapper[4745]: E1209 12:23:53.566727 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:23:58 crc kubenswrapper[4745]: I1209 12:23:58.844639 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:58 crc kubenswrapper[4745]: I1209 12:23:58.845042 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:58 crc kubenswrapper[4745]: I1209 12:23:58.904174 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:59 crc kubenswrapper[4745]: I1209 12:23:59.241207 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:23:59 crc kubenswrapper[4745]: I1209 12:23:59.293830 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:24:01 crc kubenswrapper[4745]: I1209 12:24:01.203629 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b4k5t" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="registry-server" containerID="cri-o://1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590" gracePeriod=2 Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.084178 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.213555 4745 generic.go:334] "Generic (PLEG): container finished" podID="3361ce60-925b-41ff-8582-3156c403eaaf" containerID="1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590" exitCode=0 Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.213622 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerDied","Data":"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590"} Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.213692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4k5t" event={"ID":"3361ce60-925b-41ff-8582-3156c403eaaf","Type":"ContainerDied","Data":"74e7eeface33a0ca838767ed70cab3caecba0ce695573c95284bb8da05209dcb"} Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.213714 4745 scope.go:117] "RemoveContainer" containerID="1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.213802 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4k5t" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.230920 4745 scope.go:117] "RemoveContainer" containerID="15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.246373 4745 scope.go:117] "RemoveContainer" containerID="fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.270545 4745 scope.go:117] "RemoveContainer" containerID="1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590" Dec 09 12:24:02 crc kubenswrapper[4745]: E1209 12:24:02.271063 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590\": container with ID starting with 1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590 not found: ID does not exist" containerID="1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.271116 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590"} err="failed to get container status \"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590\": rpc error: code = NotFound desc = could not find container \"1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590\": container with ID starting with 1c87498eb44870edf59801bc80b522ce3bb267b49db7132fd4789168aec06590 not found: ID does not exist" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.271156 4745 scope.go:117] "RemoveContainer" containerID="15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5" Dec 09 12:24:02 crc kubenswrapper[4745]: E1209 12:24:02.271642 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5\": container with ID starting with 15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5 not found: ID does not exist" containerID="15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.271679 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5"} err="failed to get container status \"15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5\": rpc error: code = NotFound desc = could not find container \"15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5\": container with ID starting with 15f5e89c1ba13e214053368c0ff6d8a34ba1c4b093ba5954d3bfe1846ec089c5 not found: ID does not exist" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.271702 4745 scope.go:117] "RemoveContainer" containerID="fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818" Dec 09 12:24:02 crc kubenswrapper[4745]: E1209 12:24:02.272000 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818\": container with ID starting with fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818 not found: ID does not exist" containerID="fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.272027 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818"} err="failed to get container status \"fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818\": rpc error: code = NotFound desc = could not find container 
\"fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818\": container with ID starting with fc0f799c5256fe74128012a46e336885dd8a92ab79e5f1fc62360011e6a71818 not found: ID does not exist" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.282680 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content\") pod \"3361ce60-925b-41ff-8582-3156c403eaaf\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.282912 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shnvt\" (UniqueName: \"kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt\") pod \"3361ce60-925b-41ff-8582-3156c403eaaf\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.283007 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities\") pod \"3361ce60-925b-41ff-8582-3156c403eaaf\" (UID: \"3361ce60-925b-41ff-8582-3156c403eaaf\") " Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.284623 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities" (OuterVolumeSpecName: "utilities") pod "3361ce60-925b-41ff-8582-3156c403eaaf" (UID: "3361ce60-925b-41ff-8582-3156c403eaaf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.289926 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt" (OuterVolumeSpecName: "kube-api-access-shnvt") pod "3361ce60-925b-41ff-8582-3156c403eaaf" (UID: "3361ce60-925b-41ff-8582-3156c403eaaf"). InnerVolumeSpecName "kube-api-access-shnvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.302390 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3361ce60-925b-41ff-8582-3156c403eaaf" (UID: "3361ce60-925b-41ff-8582-3156c403eaaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.384935 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shnvt\" (UniqueName: \"kubernetes.io/projected/3361ce60-925b-41ff-8582-3156c403eaaf-kube-api-access-shnvt\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.384987 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.384998 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3361ce60-925b-41ff-8582-3156c403eaaf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 12:24:02.559081 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:24:02 crc kubenswrapper[4745]: I1209 
12:24:02.565076 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4k5t"] Dec 09 12:24:03 crc kubenswrapper[4745]: I1209 12:24:03.577887 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" path="/var/lib/kubelet/pods/3361ce60-925b-41ff-8582-3156c403eaaf/volumes" Dec 09 12:24:05 crc kubenswrapper[4745]: I1209 12:24:05.555623 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:24:05 crc kubenswrapper[4745]: E1209 12:24:05.556018 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:24:19 crc kubenswrapper[4745]: I1209 12:24:19.555569 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:24:19 crc kubenswrapper[4745]: E1209 12:24:19.556446 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:24:34 crc kubenswrapper[4745]: I1209 12:24:34.554935 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:24:34 crc kubenswrapper[4745]: E1209 12:24:34.555695 4745 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:24:46 crc kubenswrapper[4745]: I1209 12:24:46.554926 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:24:46 crc kubenswrapper[4745]: E1209 12:24:46.555710 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:25:00 crc kubenswrapper[4745]: I1209 12:25:00.555751 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:25:00 crc kubenswrapper[4745]: E1209 12:25:00.556373 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:25:12 crc kubenswrapper[4745]: I1209 12:25:12.554370 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:25:12 crc kubenswrapper[4745]: E1209 12:25:12.554825 4745 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:25:27 crc kubenswrapper[4745]: I1209 12:25:27.555918 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:25:28 crc kubenswrapper[4745]: I1209 12:25:28.848833 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2"} Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.606778 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:41 crc kubenswrapper[4745]: E1209 12:27:41.608701 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="extract-content" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.608808 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="extract-content" Dec 09 12:27:41 crc kubenswrapper[4745]: E1209 12:27:41.608908 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="registry-server" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.608982 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="registry-server" Dec 09 12:27:41 crc kubenswrapper[4745]: E1209 12:27:41.609072 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="extract-utilities" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.609135 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="extract-utilities" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.609365 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="3361ce60-925b-41ff-8582-3156c403eaaf" containerName="registry-server" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.610697 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.611488 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.799690 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.800265 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrzt\" (UniqueName: \"kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.800379 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content\") pod 
\"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.901788 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.901880 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrzt\" (UniqueName: \"kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.901922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.902384 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.902495 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content\") pod \"redhat-operators-zjbwv\" (UID: 
\"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.931166 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrzt\" (UniqueName: \"kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt\") pod \"redhat-operators-zjbwv\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:41 crc kubenswrapper[4745]: I1209 12:27:41.931833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:42 crc kubenswrapper[4745]: I1209 12:27:42.402129 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:42 crc kubenswrapper[4745]: I1209 12:27:42.840046 4745 generic.go:334] "Generic (PLEG): container finished" podID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerID="3e0b04dbc6b15c3a3d9abe3d1f90e57029eb05a2b8027a268e3bf52fb7f1bef2" exitCode=0 Dec 09 12:27:42 crc kubenswrapper[4745]: I1209 12:27:42.840142 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerDied","Data":"3e0b04dbc6b15c3a3d9abe3d1f90e57029eb05a2b8027a268e3bf52fb7f1bef2"} Dec 09 12:27:42 crc kubenswrapper[4745]: I1209 12:27:42.840335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerStarted","Data":"59afe000f383e3f682a3b7168e79bb1197be0cebd581f2194ec4dbd6eaba0215"} Dec 09 12:27:43 crc kubenswrapper[4745]: I1209 12:27:43.849959 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" 
event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerStarted","Data":"6d405845b82c03be76a8f3414dddf61a5c46b48ddc9c0c1c3e1092971f053d08"} Dec 09 12:27:44 crc kubenswrapper[4745]: I1209 12:27:44.859334 4745 generic.go:334] "Generic (PLEG): container finished" podID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerID="6d405845b82c03be76a8f3414dddf61a5c46b48ddc9c0c1c3e1092971f053d08" exitCode=0 Dec 09 12:27:44 crc kubenswrapper[4745]: I1209 12:27:44.859381 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerDied","Data":"6d405845b82c03be76a8f3414dddf61a5c46b48ddc9c0c1c3e1092971f053d08"} Dec 09 12:27:45 crc kubenswrapper[4745]: I1209 12:27:45.869443 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerStarted","Data":"98ce59a5be5c01bf69a817eb2c32890b79af43edc14ac8cc9f24de029eccbd9b"} Dec 09 12:27:45 crc kubenswrapper[4745]: I1209 12:27:45.888579 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjbwv" podStartSLOduration=2.456952626 podStartE2EDuration="4.888553053s" podCreationTimestamp="2025-12-09 12:27:41 +0000 UTC" firstStartedPulling="2025-12-09 12:27:42.841581083 +0000 UTC m=+3349.666782607" lastFinishedPulling="2025-12-09 12:27:45.27318151 +0000 UTC m=+3352.098383034" observedRunningTime="2025-12-09 12:27:45.886379134 +0000 UTC m=+3352.711580658" watchObservedRunningTime="2025-12-09 12:27:45.888553053 +0000 UTC m=+3352.713754587" Dec 09 12:27:51 crc kubenswrapper[4745]: I1209 12:27:51.932597 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:51 crc kubenswrapper[4745]: I1209 12:27:51.933112 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:51 crc kubenswrapper[4745]: I1209 12:27:51.976323 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:52 crc kubenswrapper[4745]: I1209 12:27:52.987086 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:53 crc kubenswrapper[4745]: I1209 12:27:53.039373 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:54 crc kubenswrapper[4745]: I1209 12:27:54.948342 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjbwv" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="registry-server" containerID="cri-o://98ce59a5be5c01bf69a817eb2c32890b79af43edc14ac8cc9f24de029eccbd9b" gracePeriod=2 Dec 09 12:27:57 crc kubenswrapper[4745]: I1209 12:27:55.475569 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:27:57 crc kubenswrapper[4745]: I1209 12:27:55.475940 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:27:57 crc kubenswrapper[4745]: I1209 12:27:57.975532 4745 generic.go:334] "Generic (PLEG): container finished" podID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerID="98ce59a5be5c01bf69a817eb2c32890b79af43edc14ac8cc9f24de029eccbd9b" exitCode=0 Dec 09 12:27:57 crc 
kubenswrapper[4745]: I1209 12:27:57.975622 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerDied","Data":"98ce59a5be5c01bf69a817eb2c32890b79af43edc14ac8cc9f24de029eccbd9b"} Dec 09 12:27:57 crc kubenswrapper[4745]: I1209 12:27:57.976195 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjbwv" event={"ID":"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82","Type":"ContainerDied","Data":"59afe000f383e3f682a3b7168e79bb1197be0cebd581f2194ec4dbd6eaba0215"} Dec 09 12:27:57 crc kubenswrapper[4745]: I1209 12:27:57.976218 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59afe000f383e3f682a3b7168e79bb1197be0cebd581f2194ec4dbd6eaba0215" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.015737 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.172947 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrzt\" (UniqueName: \"kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt\") pod \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.173012 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities\") pod \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.173063 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content\") pod \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\" (UID: \"f675aba3-1c8b-44f6-b4b5-a3ced6e29c82\") " Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.174437 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities" (OuterVolumeSpecName: "utilities") pod "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" (UID: "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.179592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt" (OuterVolumeSpecName: "kube-api-access-srrzt") pod "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" (UID: "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82"). InnerVolumeSpecName "kube-api-access-srrzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.275840 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srrzt\" (UniqueName: \"kubernetes.io/projected/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-kube-api-access-srrzt\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.275879 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.292383 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" (UID: "f675aba3-1c8b-44f6-b4b5-a3ced6e29c82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.377281 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:27:58 crc kubenswrapper[4745]: I1209 12:27:58.981735 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjbwv" Dec 09 12:27:59 crc kubenswrapper[4745]: I1209 12:27:59.012520 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:59 crc kubenswrapper[4745]: I1209 12:27:59.017790 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjbwv"] Dec 09 12:27:59 crc kubenswrapper[4745]: I1209 12:27:59.564009 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" path="/var/lib/kubelet/pods/f675aba3-1c8b-44f6-b4b5-a3ced6e29c82/volumes" Dec 09 12:28:25 crc kubenswrapper[4745]: I1209 12:28:25.477287 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:28:25 crc kubenswrapper[4745]: I1209 12:28:25.479200 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.036009 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:41 crc kubenswrapper[4745]: E1209 12:28:41.036778 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="registry-server" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.036791 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="registry-server" Dec 09 12:28:41 crc kubenswrapper[4745]: E1209 12:28:41.036804 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="extract-utilities" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.036810 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="extract-utilities" Dec 09 12:28:41 crc kubenswrapper[4745]: E1209 12:28:41.036824 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="extract-content" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.036831 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="extract-content" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.036978 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f675aba3-1c8b-44f6-b4b5-a3ced6e29c82" containerName="registry-server" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.038030 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.047213 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.236884 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.238881 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.250025 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdskr\" (UniqueName: \"kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358402 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4nn\" (UniqueName: \"kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358468 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " 
pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358565 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358683 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.358711 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.459757 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.459869 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities\") pod \"community-operators-cthhw\" (UID: 
\"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.459896 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.459949 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdskr\" (UniqueName: \"kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.459980 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn4nn\" (UniqueName: \"kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.460012 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.460411 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities\") pod \"community-operators-cthhw\" (UID: 
\"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.460548 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.460584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.460751 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.483010 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdskr\" (UniqueName: \"kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr\") pod \"community-operators-cthhw\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.484152 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn4nn\" (UniqueName: \"kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn\") pod \"certified-operators-rjv8d\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") 
" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.558849 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:41 crc kubenswrapper[4745]: I1209 12:28:41.664219 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.037610 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:42 crc kubenswrapper[4745]: W1209 12:28:42.046500 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode920304a_929f_4bd4_b3bb_f9aeeb3f735b.slice/crio-a2a4b5160c2c5362745479c7250f53c4ba33d5932dc1a18397a12fdf36e6e852 WatchSource:0}: Error finding container a2a4b5160c2c5362745479c7250f53c4ba33d5932dc1a18397a12fdf36e6e852: Status 404 returned error can't find the container with id a2a4b5160c2c5362745479c7250f53c4ba33d5932dc1a18397a12fdf36e6e852 Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.137253 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.369944 4745 generic.go:334] "Generic (PLEG): container finished" podID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerID="741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962" exitCode=0 Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.370071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerDied","Data":"741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962"} Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.370118 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerStarted","Data":"05d6d3c6f3a263dfe67018101fa28464c9cec1538f411742bfa052bff1294730"} Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.372689 4745 generic.go:334] "Generic (PLEG): container finished" podID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerID="8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041" exitCode=0 Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.372735 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerDied","Data":"8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041"} Dec 09 12:28:42 crc kubenswrapper[4745]: I1209 12:28:42.373395 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerStarted","Data":"a2a4b5160c2c5362745479c7250f53c4ba33d5932dc1a18397a12fdf36e6e852"} Dec 09 12:28:43 crc kubenswrapper[4745]: I1209 12:28:43.383096 4745 generic.go:334] "Generic (PLEG): container finished" podID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerID="332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94" exitCode=0 Dec 09 12:28:43 crc kubenswrapper[4745]: I1209 12:28:43.383163 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerDied","Data":"332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94"} Dec 09 12:28:43 crc kubenswrapper[4745]: I1209 12:28:43.388343 4745 generic.go:334] "Generic (PLEG): container finished" podID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerID="8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b" exitCode=0 Dec 09 12:28:43 crc 
kubenswrapper[4745]: I1209 12:28:43.388376 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerDied","Data":"8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b"} Dec 09 12:28:44 crc kubenswrapper[4745]: I1209 12:28:44.397336 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerStarted","Data":"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1"} Dec 09 12:28:44 crc kubenswrapper[4745]: I1209 12:28:44.400792 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerStarted","Data":"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703"} Dec 09 12:28:44 crc kubenswrapper[4745]: I1209 12:28:44.423049 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjv8d" podStartSLOduration=1.79568673 podStartE2EDuration="3.423025664s" podCreationTimestamp="2025-12-09 12:28:41 +0000 UTC" firstStartedPulling="2025-12-09 12:28:42.372083946 +0000 UTC m=+3409.197285470" lastFinishedPulling="2025-12-09 12:28:43.99942289 +0000 UTC m=+3410.824624404" observedRunningTime="2025-12-09 12:28:44.419171061 +0000 UTC m=+3411.244372585" watchObservedRunningTime="2025-12-09 12:28:44.423025664 +0000 UTC m=+3411.248227188" Dec 09 12:28:44 crc kubenswrapper[4745]: I1209 12:28:44.447323 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cthhw" podStartSLOduration=1.94412743 podStartE2EDuration="3.447294778s" podCreationTimestamp="2025-12-09 12:28:41 +0000 UTC" firstStartedPulling="2025-12-09 12:28:42.373867884 +0000 UTC m=+3409.199069408" 
lastFinishedPulling="2025-12-09 12:28:43.877035242 +0000 UTC m=+3410.702236756" observedRunningTime="2025-12-09 12:28:44.441934424 +0000 UTC m=+3411.267135948" watchObservedRunningTime="2025-12-09 12:28:44.447294778 +0000 UTC m=+3411.272496302" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.563725 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.564337 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.602275 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.665303 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.665373 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:51 crc kubenswrapper[4745]: I1209 12:28:51.707003 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:52 crc kubenswrapper[4745]: I1209 12:28:52.509188 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:52 crc kubenswrapper[4745]: I1209 12:28:52.513899 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.020429 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 
12:28:55.021035 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjv8d" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="registry-server" containerID="cri-o://d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1" gracePeriod=2 Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.221652 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.221978 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cthhw" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="registry-server" containerID="cri-o://3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703" gracePeriod=2 Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.475210 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.475283 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.475336 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.476071 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:28:55 crc kubenswrapper[4745]: I1209 12:28:55.476141 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2" gracePeriod=600 Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.099910 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.203431 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities\") pod \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.203584 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn4nn\" (UniqueName: \"kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn\") pod \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.203624 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content\") pod \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\" (UID: \"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.205694 4745 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities" (OuterVolumeSpecName: "utilities") pod "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" (UID: "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.209897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn" (OuterVolumeSpecName: "kube-api-access-sn4nn") pod "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" (UID: "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893"). InnerVolumeSpecName "kube-api-access-sn4nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.266391 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" (UID: "0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.305997 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn4nn\" (UniqueName: \"kubernetes.io/projected/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-kube-api-access-sn4nn\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.306029 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.306055 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.359904 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.509101 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content\") pod \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.509411 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdskr\" (UniqueName: \"kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr\") pod \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.509632 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities\") pod \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\" (UID: \"e920304a-929f-4bd4-b3bb-f9aeeb3f735b\") " Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.510987 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities" (OuterVolumeSpecName: "utilities") pod "e920304a-929f-4bd4-b3bb-f9aeeb3f735b" (UID: "e920304a-929f-4bd4-b3bb-f9aeeb3f735b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.512718 4745 generic.go:334] "Generic (PLEG): container finished" podID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerID="3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703" exitCode=0 Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.512811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerDied","Data":"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.512845 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cthhw" event={"ID":"e920304a-929f-4bd4-b3bb-f9aeeb3f735b","Type":"ContainerDied","Data":"a2a4b5160c2c5362745479c7250f53c4ba33d5932dc1a18397a12fdf36e6e852"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.512849 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cthhw" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.512894 4745 scope.go:117] "RemoveContainer" containerID="3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.517312 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr" (OuterVolumeSpecName: "kube-api-access-xdskr") pod "e920304a-929f-4bd4-b3bb-f9aeeb3f735b" (UID: "e920304a-929f-4bd4-b3bb-f9aeeb3f735b"). InnerVolumeSpecName "kube-api-access-xdskr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.519938 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2" exitCode=0 Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.520021 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.520088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.525125 4745 generic.go:334] "Generic (PLEG): container finished" podID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerID="d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1" exitCode=0 Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.525179 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerDied","Data":"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.525210 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjv8d" event={"ID":"0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893","Type":"ContainerDied","Data":"05d6d3c6f3a263dfe67018101fa28464c9cec1538f411742bfa052bff1294730"} Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.525273 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjv8d" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.546533 4745 scope.go:117] "RemoveContainer" containerID="8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.569108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e920304a-929f-4bd4-b3bb-f9aeeb3f735b" (UID: "e920304a-929f-4bd4-b3bb-f9aeeb3f735b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.577181 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.583405 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjv8d"] Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.599553 4745 scope.go:117] "RemoveContainer" containerID="8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.611348 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.611386 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdskr\" (UniqueName: \"kubernetes.io/projected/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-kube-api-access-xdskr\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.611396 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e920304a-929f-4bd4-b3bb-f9aeeb3f735b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.621948 4745 scope.go:117] "RemoveContainer" containerID="3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.622556 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703\": container with ID starting with 3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703 not found: ID does not exist" 
containerID="3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.622591 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703"} err="failed to get container status \"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703\": rpc error: code = NotFound desc = could not find container \"3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703\": container with ID starting with 3a1e61399ca66d7095372b44545e3a8486c58877f9ed6e5f99240aa6b45d9703 not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.622614 4745 scope.go:117] "RemoveContainer" containerID="8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.623293 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b\": container with ID starting with 8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b not found: ID does not exist" containerID="8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.623353 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b"} err="failed to get container status \"8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b\": rpc error: code = NotFound desc = could not find container \"8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b\": container with ID starting with 8bb08456b3301e54a8dab7e7708cee909965e7bf69274f55bc8ca8b5c5445a8b not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.623389 4745 scope.go:117] 
"RemoveContainer" containerID="8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.623789 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041\": container with ID starting with 8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041 not found: ID does not exist" containerID="8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.623815 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041"} err="failed to get container status \"8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041\": rpc error: code = NotFound desc = could not find container \"8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041\": container with ID starting with 8a21727829d24e0f03c8f168f88a022e2543cc2371652262b5500ec9b093f041 not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.623836 4745 scope.go:117] "RemoveContainer" containerID="4ebc9fa739407a133560115d94bcb3a7da88322233b9f138e0f0fa476b497fc1" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.662502 4745 scope.go:117] "RemoveContainer" containerID="d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.682459 4745 scope.go:117] "RemoveContainer" containerID="332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.703203 4745 scope.go:117] "RemoveContainer" containerID="741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.727427 4745 scope.go:117] "RemoveContainer" 
containerID="d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.728092 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1\": container with ID starting with d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1 not found: ID does not exist" containerID="d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.728155 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1"} err="failed to get container status \"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1\": rpc error: code = NotFound desc = could not find container \"d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1\": container with ID starting with d57b5c9d98341c70e3a7966a374d019dd0e3a9e52db56ff58c1163a6a4d646d1 not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.728196 4745 scope.go:117] "RemoveContainer" containerID="332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.728691 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94\": container with ID starting with 332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94 not found: ID does not exist" containerID="332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.728724 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94"} err="failed to get container status \"332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94\": rpc error: code = NotFound desc = could not find container \"332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94\": container with ID starting with 332684567c7204823e5f54cfc6ea69529439b3d3a49ea08f5d947ffb70650e94 not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.728745 4745 scope.go:117] "RemoveContainer" containerID="741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962" Dec 09 12:28:58 crc kubenswrapper[4745]: E1209 12:28:58.728963 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962\": container with ID starting with 741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962 not found: ID does not exist" containerID="741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.728999 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962"} err="failed to get container status \"741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962\": rpc error: code = NotFound desc = could not find container \"741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962\": container with ID starting with 741a1a1eb58c4354a2b462b2b66bc83f3b3397f4520b2fa5dc35ad3bf9c1a962 not found: ID does not exist" Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.852006 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:58 crc kubenswrapper[4745]: I1209 12:28:58.858424 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-cthhw"] Dec 09 12:28:59 crc kubenswrapper[4745]: I1209 12:28:59.564666 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" path="/var/lib/kubelet/pods/0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893/volumes" Dec 09 12:28:59 crc kubenswrapper[4745]: I1209 12:28:59.565800 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" path="/var/lib/kubelet/pods/e920304a-929f-4bd4-b3bb-f9aeeb3f735b/volumes" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.146762 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl"] Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 12:30:00.147711 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="extract-content" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147729 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="extract-content" Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 12:30:00.147751 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="extract-utilities" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147759 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="extract-utilities" Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 12:30:00.147780 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="extract-content" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147787 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="extract-content" Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 
12:30:00.147803 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="extract-utilities" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147811 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="extract-utilities" Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 12:30:00.147824 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147832 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: E1209 12:30:00.147841 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.147848 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.148007 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e920304a-929f-4bd4-b3bb-f9aeeb3f735b" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.148029 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e34e30a-3dd9-44a3-8f83-5f1a7d3fb893" containerName="registry-server" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.148649 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.151391 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.157166 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.157784 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl"] Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.240379 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzcs\" (UniqueName: \"kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.240452 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.240491 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.341601 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzcs\" (UniqueName: \"kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.341686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.341731 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.343250 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.349928 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.362754 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzcs\" (UniqueName: \"kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs\") pod \"collect-profiles-29421390-cggfl\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.473364 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.885584 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl"] Dec 09 12:30:00 crc kubenswrapper[4745]: I1209 12:30:00.997877 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" event={"ID":"a0197cce-ee10-4ff4-b41b-759decbf50db","Type":"ContainerStarted","Data":"aabe946c4470f9d671e9477f18dc721b8a3654533990209df2cdb2d00744253a"} Dec 09 12:30:02 crc kubenswrapper[4745]: I1209 12:30:02.012335 4745 generic.go:334] "Generic (PLEG): container finished" podID="a0197cce-ee10-4ff4-b41b-759decbf50db" containerID="8d40aa3822b8ec9dca75003ba1f1795fad33b4f57e30de89a9c712fd04c3020d" exitCode=0 Dec 09 12:30:02 crc kubenswrapper[4745]: I1209 12:30:02.012944 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" 
event={"ID":"a0197cce-ee10-4ff4-b41b-759decbf50db","Type":"ContainerDied","Data":"8d40aa3822b8ec9dca75003ba1f1795fad33b4f57e30de89a9c712fd04c3020d"} Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.280735 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.391656 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume\") pod \"a0197cce-ee10-4ff4-b41b-759decbf50db\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.391809 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftzcs\" (UniqueName: \"kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs\") pod \"a0197cce-ee10-4ff4-b41b-759decbf50db\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.391852 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume\") pod \"a0197cce-ee10-4ff4-b41b-759decbf50db\" (UID: \"a0197cce-ee10-4ff4-b41b-759decbf50db\") " Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.392909 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume" (OuterVolumeSpecName: "config-volume") pod "a0197cce-ee10-4ff4-b41b-759decbf50db" (UID: "a0197cce-ee10-4ff4-b41b-759decbf50db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.397196 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a0197cce-ee10-4ff4-b41b-759decbf50db" (UID: "a0197cce-ee10-4ff4-b41b-759decbf50db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.397298 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs" (OuterVolumeSpecName: "kube-api-access-ftzcs") pod "a0197cce-ee10-4ff4-b41b-759decbf50db" (UID: "a0197cce-ee10-4ff4-b41b-759decbf50db"). InnerVolumeSpecName "kube-api-access-ftzcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.493677 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0197cce-ee10-4ff4-b41b-759decbf50db-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.493731 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftzcs\" (UniqueName: \"kubernetes.io/projected/a0197cce-ee10-4ff4-b41b-759decbf50db-kube-api-access-ftzcs\") on node \"crc\" DevicePath \"\"" Dec 09 12:30:03 crc kubenswrapper[4745]: I1209 12:30:03.493745 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0197cce-ee10-4ff4-b41b-759decbf50db-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:30:04 crc kubenswrapper[4745]: I1209 12:30:04.029649 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" 
event={"ID":"a0197cce-ee10-4ff4-b41b-759decbf50db","Type":"ContainerDied","Data":"aabe946c4470f9d671e9477f18dc721b8a3654533990209df2cdb2d00744253a"} Dec 09 12:30:04 crc kubenswrapper[4745]: I1209 12:30:04.029692 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aabe946c4470f9d671e9477f18dc721b8a3654533990209df2cdb2d00744253a" Dec 09 12:30:04 crc kubenswrapper[4745]: I1209 12:30:04.029709 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421390-cggfl" Dec 09 12:30:04 crc kubenswrapper[4745]: I1209 12:30:04.351119 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb"] Dec 09 12:30:04 crc kubenswrapper[4745]: I1209 12:30:04.356908 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421345-fdcrb"] Dec 09 12:30:05 crc kubenswrapper[4745]: I1209 12:30:05.563656 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6b0836-fccb-4178-94f4-99d6c2d93c1a" path="/var/lib/kubelet/pods/4e6b0836-fccb-4178-94f4-99d6c2d93c1a/volumes" Dec 09 12:30:05 crc kubenswrapper[4745]: I1209 12:30:05.742577 4745 scope.go:117] "RemoveContainer" containerID="d8e43ca97105c5e0d445175330bb609be6eba3fc05d3026582d9082087b979c9" Dec 09 12:31:25 crc kubenswrapper[4745]: I1209 12:31:25.475655 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:31:25 crc kubenswrapper[4745]: I1209 12:31:25.476322 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:31:55 crc kubenswrapper[4745]: I1209 12:31:55.475202 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:31:55 crc kubenswrapper[4745]: I1209 12:31:55.475821 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:32:25 crc kubenswrapper[4745]: I1209 12:32:25.477453 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:32:25 crc kubenswrapper[4745]: I1209 12:32:25.478280 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:32:25 crc kubenswrapper[4745]: I1209 12:32:25.478344 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:32:25 crc kubenswrapper[4745]: I1209 12:32:25.479079 4745 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:32:25 crc kubenswrapper[4745]: I1209 12:32:25.479136 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" gracePeriod=600 Dec 09 12:32:25 crc kubenswrapper[4745]: E1209 12:32:25.604841 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:32:26 crc kubenswrapper[4745]: I1209 12:32:26.159230 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" exitCode=0 Dec 09 12:32:26 crc kubenswrapper[4745]: I1209 12:32:26.159327 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"} Dec 09 12:32:26 crc kubenswrapper[4745]: I1209 12:32:26.159635 4745 scope.go:117] "RemoveContainer" containerID="8758583a7d7f5e423ff89d3afe5ee82a5df56f0e405db6c36462d9b3f5716aa2" Dec 09 12:32:26 crc 
kubenswrapper[4745]: I1209 12:32:26.161554 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:32:26 crc kubenswrapper[4745]: E1209 12:32:26.161905 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:32:40 crc kubenswrapper[4745]: I1209 12:32:40.555279 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:32:40 crc kubenswrapper[4745]: E1209 12:32:40.556025 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:32:53 crc kubenswrapper[4745]: I1209 12:32:53.559989 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:32:53 crc kubenswrapper[4745]: E1209 12:32:53.560582 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 
09 12:33:04 crc kubenswrapper[4745]: I1209 12:33:04.554821 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:33:04 crc kubenswrapper[4745]: E1209 12:33:04.555580 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:33:19 crc kubenswrapper[4745]: I1209 12:33:19.555406 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:33:19 crc kubenswrapper[4745]: E1209 12:33:19.556166 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:33:33 crc kubenswrapper[4745]: I1209 12:33:33.562834 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:33:33 crc kubenswrapper[4745]: E1209 12:33:33.564852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:33:48 crc kubenswrapper[4745]: I1209 12:33:48.554787 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:33:48 crc kubenswrapper[4745]: E1209 12:33:48.555840 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.578666 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"] Dec 09 12:34:02 crc kubenswrapper[4745]: E1209 12:34:02.581299 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0197cce-ee10-4ff4-b41b-759decbf50db" containerName="collect-profiles" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.582597 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0197cce-ee10-4ff4-b41b-759decbf50db" containerName="collect-profiles" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.583110 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0197cce-ee10-4ff4-b41b-759decbf50db" containerName="collect-profiles" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.585685 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2qqn" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.602143 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"] Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.691136 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.691481 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcnl\" (UniqueName: \"kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.691546 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.792916 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn" Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.793006 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5gcnl\" (UniqueName: \"kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.793060 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.793554 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.793631 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.813340 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcnl\" (UniqueName: \"kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl\") pod \"redhat-marketplace-p2qqn\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") " pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:02 crc kubenswrapper[4745]: I1209 12:34:02.920004 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:03 crc kubenswrapper[4745]: I1209 12:34:03.377808 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"]
Dec 09 12:34:03 crc kubenswrapper[4745]: I1209 12:34:03.561382 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:34:03 crc kubenswrapper[4745]: E1209 12:34:03.561707 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:34:03 crc kubenswrapper[4745]: I1209 12:34:03.868083 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerStarted","Data":"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"}
Dec 09 12:34:03 crc kubenswrapper[4745]: I1209 12:34:03.868161 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerStarted","Data":"5d8c05907b77a92a3d6b5393a9ddaf262e98f1aa2cce7deb3dbc6a4eae74d85f"}
Dec 09 12:34:03 crc kubenswrapper[4745]: I1209 12:34:03.872942 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 12:34:04 crc kubenswrapper[4745]: I1209 12:34:04.875703 4745 generic.go:334] "Generic (PLEG): container finished" podID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerID="f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18" exitCode=0
Dec 09 12:34:04 crc kubenswrapper[4745]: I1209 12:34:04.875909 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerDied","Data":"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"}
Dec 09 12:34:05 crc kubenswrapper[4745]: I1209 12:34:05.842125 4745 scope.go:117] "RemoveContainer" containerID="6d405845b82c03be76a8f3414dddf61a5c46b48ddc9c0c1c3e1092971f053d08"
Dec 09 12:34:05 crc kubenswrapper[4745]: I1209 12:34:05.861066 4745 scope.go:117] "RemoveContainer" containerID="98ce59a5be5c01bf69a817eb2c32890b79af43edc14ac8cc9f24de029eccbd9b"
Dec 09 12:34:05 crc kubenswrapper[4745]: I1209 12:34:05.885460 4745 generic.go:334] "Generic (PLEG): container finished" podID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerID="524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba" exitCode=0
Dec 09 12:34:05 crc kubenswrapper[4745]: I1209 12:34:05.885541 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerDied","Data":"524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba"}
Dec 09 12:34:05 crc kubenswrapper[4745]: I1209 12:34:05.933846 4745 scope.go:117] "RemoveContainer" containerID="3e0b04dbc6b15c3a3d9abe3d1f90e57029eb05a2b8027a268e3bf52fb7f1bef2"
Dec 09 12:34:06 crc kubenswrapper[4745]: I1209 12:34:06.895293 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerStarted","Data":"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"}
Dec 09 12:34:06 crc kubenswrapper[4745]: I1209 12:34:06.913800 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2qqn" podStartSLOduration=2.509565529 podStartE2EDuration="4.913775988s" podCreationTimestamp="2025-12-09 12:34:02 +0000 UTC" firstStartedPulling="2025-12-09 12:34:03.872641526 +0000 UTC m=+3730.697843050" lastFinishedPulling="2025-12-09 12:34:06.276851985 +0000 UTC m=+3733.102053509" observedRunningTime="2025-12-09 12:34:06.912099653 +0000 UTC m=+3733.737301187" watchObservedRunningTime="2025-12-09 12:34:06.913775988 +0000 UTC m=+3733.738977522"
Dec 09 12:34:12 crc kubenswrapper[4745]: I1209 12:34:12.920934 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:12 crc kubenswrapper[4745]: I1209 12:34:12.921495 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:12 crc kubenswrapper[4745]: I1209 12:34:12.968953 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:13 crc kubenswrapper[4745]: I1209 12:34:13.036908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:13 crc kubenswrapper[4745]: I1209 12:34:13.207664 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"]
Dec 09 12:34:14 crc kubenswrapper[4745]: I1209 12:34:14.956459 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2qqn" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="registry-server" containerID="cri-o://2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899" gracePeriod=2
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.893337 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.968598 4745 generic.go:334] "Generic (PLEG): container finished" podID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerID="2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899" exitCode=0
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.968664 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerDied","Data":"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"}
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.968713 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2qqn" event={"ID":"41021c1a-7272-4cf1-a1a4-2a4be54476f2","Type":"ContainerDied","Data":"5d8c05907b77a92a3d6b5393a9ddaf262e98f1aa2cce7deb3dbc6a4eae74d85f"}
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.968737 4745 scope.go:117] "RemoveContainer" containerID="2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.968759 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2qqn"
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.969363 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content\") pod \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") "
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.995800 4745 scope.go:117] "RemoveContainer" containerID="524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba"
Dec 09 12:34:15 crc kubenswrapper[4745]: I1209 12:34:15.996502 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41021c1a-7272-4cf1-a1a4-2a4be54476f2" (UID: "41021c1a-7272-4cf1-a1a4-2a4be54476f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.011098 4745 scope.go:117] "RemoveContainer" containerID="f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.047236 4745 scope.go:117] "RemoveContainer" containerID="2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"
Dec 09 12:34:16 crc kubenswrapper[4745]: E1209 12:34:16.047915 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899\": container with ID starting with 2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899 not found: ID does not exist" containerID="2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.047986 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899"} err="failed to get container status \"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899\": rpc error: code = NotFound desc = could not find container \"2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899\": container with ID starting with 2b821e5e58e7c0fb1e1c99bf2c8c35f88da3cb56d01670f7178687aec3cc1899 not found: ID does not exist"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.048015 4745 scope.go:117] "RemoveContainer" containerID="524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba"
Dec 09 12:34:16 crc kubenswrapper[4745]: E1209 12:34:16.048326 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba\": container with ID starting with 524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba not found: ID does not exist" containerID="524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.048364 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba"} err="failed to get container status \"524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba\": rpc error: code = NotFound desc = could not find container \"524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba\": container with ID starting with 524d1d722475f3a2a153954cb51b103071d9813e964b839556662a00ad9468ba not found: ID does not exist"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.048384 4745 scope.go:117] "RemoveContainer" containerID="f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"
Dec 09 12:34:16 crc kubenswrapper[4745]: E1209 12:34:16.048785 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18\": container with ID starting with f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18 not found: ID does not exist" containerID="f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.048819 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18"} err="failed to get container status \"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18\": rpc error: code = NotFound desc = could not find container \"f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18\": container with ID starting with f02c993971cbb3fbaf8a4a7b91ce58563672b624d84b5dc807c942f0947ebe18 not found: ID does not exist"
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.070337 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities\") pod \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") "
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.070688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcnl\" (UniqueName: \"kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl\") pod \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\" (UID: \"41021c1a-7272-4cf1-a1a4-2a4be54476f2\") "
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.070871 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.071331 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities" (OuterVolumeSpecName: "utilities") pod "41021c1a-7272-4cf1-a1a4-2a4be54476f2" (UID: "41021c1a-7272-4cf1-a1a4-2a4be54476f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.076571 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl" (OuterVolumeSpecName: "kube-api-access-5gcnl") pod "41021c1a-7272-4cf1-a1a4-2a4be54476f2" (UID: "41021c1a-7272-4cf1-a1a4-2a4be54476f2"). InnerVolumeSpecName "kube-api-access-5gcnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.172410 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41021c1a-7272-4cf1-a1a4-2a4be54476f2-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.172449 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcnl\" (UniqueName: \"kubernetes.io/projected/41021c1a-7272-4cf1-a1a4-2a4be54476f2-kube-api-access-5gcnl\") on node \"crc\" DevicePath \"\""
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.302915 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"]
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.311190 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2qqn"]
Dec 09 12:34:16 crc kubenswrapper[4745]: I1209 12:34:16.554886 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:34:16 crc kubenswrapper[4745]: E1209 12:34:16.555116 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:34:17 crc kubenswrapper[4745]: I1209 12:34:17.563988 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" path="/var/lib/kubelet/pods/41021c1a-7272-4cf1-a1a4-2a4be54476f2/volumes"
Dec 09 12:34:31 crc kubenswrapper[4745]: I1209 12:34:31.555267 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:34:31 crc kubenswrapper[4745]: E1209 12:34:31.556230 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:34:43 crc kubenswrapper[4745]: I1209 12:34:43.560493 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:34:43 crc kubenswrapper[4745]: E1209 12:34:43.561462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:34:58 crc kubenswrapper[4745]: I1209 12:34:58.555330 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:34:58 crc kubenswrapper[4745]: E1209 12:34:58.556037 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:35:10 crc kubenswrapper[4745]: I1209 12:35:10.555068 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:35:10 crc kubenswrapper[4745]: E1209 12:35:10.556637 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:35:25 crc kubenswrapper[4745]: I1209 12:35:25.554734 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:35:25 crc kubenswrapper[4745]: E1209 12:35:25.557189 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:35:37 crc kubenswrapper[4745]: I1209 12:35:37.554401 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:35:37 crc kubenswrapper[4745]: E1209 12:35:37.555154 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:35:48 crc kubenswrapper[4745]: I1209 12:35:48.554680 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:35:48 crc kubenswrapper[4745]: E1209 12:35:48.555339 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:35:59 crc kubenswrapper[4745]: I1209 12:35:59.555670 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:35:59 crc kubenswrapper[4745]: E1209 12:35:59.556539 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:36:11 crc kubenswrapper[4745]: I1209 12:36:11.554915 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:36:11 crc kubenswrapper[4745]: E1209 12:36:11.555553 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:36:24 crc kubenswrapper[4745]: I1209 12:36:24.554602 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:36:24 crc kubenswrapper[4745]: E1209 12:36:24.556677 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:36:38 crc kubenswrapper[4745]: I1209 12:36:38.554474 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:36:38 crc kubenswrapper[4745]: E1209 12:36:38.555185 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:36:49 crc kubenswrapper[4745]: I1209 12:36:49.555441 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:36:49 crc kubenswrapper[4745]: E1209 12:36:49.556012 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:37:00 crc kubenswrapper[4745]: I1209 12:37:00.555278 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:37:00 crc kubenswrapper[4745]: E1209 12:37:00.555989 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:37:11 crc kubenswrapper[4745]: I1209 12:37:11.554996 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:37:11 crc kubenswrapper[4745]: E1209 12:37:11.555713 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:37:23 crc kubenswrapper[4745]: I1209 12:37:23.566464 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:37:23 crc kubenswrapper[4745]: E1209 12:37:23.569065 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"
Dec 09 12:37:37 crc kubenswrapper[4745]: I1209 12:37:37.554765 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e"
Dec 09 12:37:38 crc kubenswrapper[4745]: I1209 12:37:38.521523 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6"}
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.532470 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"]
Dec 09 12:37:47 crc kubenswrapper[4745]: E1209 12:37:47.533332 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="extract-utilities"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.533346 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="extract-utilities"
Dec 09 12:37:47 crc kubenswrapper[4745]: E1209 12:37:47.533359 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="extract-content"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.533365 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="extract-content"
Dec 09 12:37:47 crc kubenswrapper[4745]: E1209 12:37:47.533385 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="registry-server"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.533391 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="registry-server"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.533586 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="41021c1a-7272-4cf1-a1a4-2a4be54476f2" containerName="registry-server"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.534693 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.540329 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"]
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.574638 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.574989 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdj5\" (UniqueName: \"kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.575116 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.676425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdj5\" (UniqueName: \"kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.676538 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.676563 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.677016 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.677076 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.696529 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdj5\" (UniqueName: \"kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5\") pod \"redhat-operators-bqmcn\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") " pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:47 crc kubenswrapper[4745]: I1209 12:37:47.854892 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:48 crc kubenswrapper[4745]: I1209 12:37:48.602749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"]
Dec 09 12:37:49 crc kubenswrapper[4745]: I1209 12:37:49.606003 4745 generic.go:334] "Generic (PLEG): container finished" podID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerID="9ea519ded45316f84ec4b7c101e3a07fb30bc1a7650adf10697aaf22cb1ce7da" exitCode=0
Dec 09 12:37:49 crc kubenswrapper[4745]: I1209 12:37:49.606414 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerDied","Data":"9ea519ded45316f84ec4b7c101e3a07fb30bc1a7650adf10697aaf22cb1ce7da"}
Dec 09 12:37:49 crc kubenswrapper[4745]: I1209 12:37:49.606678 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerStarted","Data":"6c8c77cfb14a446e70558ddb06a0908535f1b1ce190cbb45fc719524da918df2"}
Dec 09 12:37:50 crc kubenswrapper[4745]: I1209 12:37:50.617940 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerStarted","Data":"89437d7fed6e4e30396160af0383950d0948dd4a1ae21be352c363a1b68c8c75"}
Dec 09 12:37:51 crc kubenswrapper[4745]: I1209 12:37:51.630149 4745 generic.go:334] "Generic (PLEG): container finished" podID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerID="89437d7fed6e4e30396160af0383950d0948dd4a1ae21be352c363a1b68c8c75" exitCode=0
Dec 09 12:37:51 crc kubenswrapper[4745]: I1209 12:37:51.630279 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerDied","Data":"89437d7fed6e4e30396160af0383950d0948dd4a1ae21be352c363a1b68c8c75"}
Dec 09 12:37:52 crc kubenswrapper[4745]: I1209 12:37:52.638962 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerStarted","Data":"c96f27dc26dd728a1534bfba49d1ccc013dffd300404a28a20f2ec99739b4ed2"}
Dec 09 12:37:57 crc kubenswrapper[4745]: I1209 12:37:57.855481 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:57 crc kubenswrapper[4745]: I1209 12:37:57.855982 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:57 crc kubenswrapper[4745]: I1209 12:37:57.900760 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:57 crc kubenswrapper[4745]: I1209 12:37:57.933341 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bqmcn" podStartSLOduration=8.537094865 podStartE2EDuration="10.933309588s" podCreationTimestamp="2025-12-09 12:37:47 +0000 UTC" firstStartedPulling="2025-12-09 12:37:49.608135022 +0000 UTC m=+3956.433336546" lastFinishedPulling="2025-12-09 12:37:52.004349745 +0000 UTC m=+3958.829551269" observedRunningTime="2025-12-09 12:37:52.661566037 +0000 UTC m=+3959.486767581" watchObservedRunningTime="2025-12-09 12:37:57.933309588 +0000 UTC m=+3964.758511122"
Dec 09 12:37:58 crc kubenswrapper[4745]: I1209 12:37:58.728028 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:37:58 crc kubenswrapper[4745]: I1209 12:37:58.775329 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"]
Dec 09 12:38:00 crc kubenswrapper[4745]: I1209 12:38:00.699493 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bqmcn" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="registry-server" containerID="cri-o://c96f27dc26dd728a1534bfba49d1ccc013dffd300404a28a20f2ec99739b4ed2" gracePeriod=2
Dec 09 12:38:03 crc kubenswrapper[4745]: I1209 12:38:03.726947 4745 generic.go:334] "Generic (PLEG): container finished" podID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerID="c96f27dc26dd728a1534bfba49d1ccc013dffd300404a28a20f2ec99739b4ed2" exitCode=0
Dec 09 12:38:03 crc kubenswrapper[4745]: I1209 12:38:03.727032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerDied","Data":"c96f27dc26dd728a1534bfba49d1ccc013dffd300404a28a20f2ec99739b4ed2"}
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.060111 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqmcn"
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.126333 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities\") pod \"580557fe-905e-42ea-aa5d-f1d83b340e00\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") "
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.126379 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content\") pod \"580557fe-905e-42ea-aa5d-f1d83b340e00\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") "
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.126448 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcdj5\" (UniqueName: \"kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5\") pod \"580557fe-905e-42ea-aa5d-f1d83b340e00\" (UID: \"580557fe-905e-42ea-aa5d-f1d83b340e00\") "
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.127695 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities" (OuterVolumeSpecName: "utilities") pod "580557fe-905e-42ea-aa5d-f1d83b340e00" (UID: "580557fe-905e-42ea-aa5d-f1d83b340e00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.133996 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5" (OuterVolumeSpecName: "kube-api-access-pcdj5") pod "580557fe-905e-42ea-aa5d-f1d83b340e00" (UID: "580557fe-905e-42ea-aa5d-f1d83b340e00"). InnerVolumeSpecName "kube-api-access-pcdj5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.229081 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcdj5\" (UniqueName: \"kubernetes.io/projected/580557fe-905e-42ea-aa5d-f1d83b340e00-kube-api-access-pcdj5\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.229116 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.243369 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "580557fe-905e-42ea-aa5d-f1d83b340e00" (UID: "580557fe-905e-42ea-aa5d-f1d83b340e00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.330462 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580557fe-905e-42ea-aa5d-f1d83b340e00-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.741459 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqmcn" event={"ID":"580557fe-905e-42ea-aa5d-f1d83b340e00","Type":"ContainerDied","Data":"6c8c77cfb14a446e70558ddb06a0908535f1b1ce190cbb45fc719524da918df2"} Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.741537 4745 scope.go:117] "RemoveContainer" containerID="c96f27dc26dd728a1534bfba49d1ccc013dffd300404a28a20f2ec99739b4ed2" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.741706 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqmcn" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.773812 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"] Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.774827 4745 scope.go:117] "RemoveContainer" containerID="89437d7fed6e4e30396160af0383950d0948dd4a1ae21be352c363a1b68c8c75" Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.779013 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bqmcn"] Dec 09 12:38:04 crc kubenswrapper[4745]: I1209 12:38:04.796636 4745 scope.go:117] "RemoveContainer" containerID="9ea519ded45316f84ec4b7c101e3a07fb30bc1a7650adf10697aaf22cb1ce7da" Dec 09 12:38:05 crc kubenswrapper[4745]: I1209 12:38:05.564397 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" path="/var/lib/kubelet/pods/580557fe-905e-42ea-aa5d-f1d83b340e00/volumes" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.865687 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:38:52 crc kubenswrapper[4745]: E1209 12:38:52.866820 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="extract-utilities" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.866847 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="extract-utilities" Dec 09 12:38:52 crc kubenswrapper[4745]: E1209 12:38:52.866876 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="extract-content" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.866887 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="extract-content" Dec 09 
12:38:52 crc kubenswrapper[4745]: E1209 12:38:52.866904 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="registry-server" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.866912 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="registry-server" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.867098 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="580557fe-905e-42ea-aa5d-f1d83b340e00" containerName="registry-server" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.868462 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.882171 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.911408 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.911458 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksl8\" (UniqueName: \"kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:52 crc kubenswrapper[4745]: I1209 12:38:52.911490 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.013716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.014030 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksl8\" (UniqueName: \"kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.014089 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.014407 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.014684 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.049537 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksl8\" (UniqueName: \"kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8\") pod \"community-operators-pnhkk\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.229668 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:38:53 crc kubenswrapper[4745]: I1209 12:38:53.707209 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:38:54 crc kubenswrapper[4745]: I1209 12:38:54.072598 4745 generic.go:334] "Generic (PLEG): container finished" podID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerID="26add0c6e8d7025444ef02c28bf6ec66d896614aadef298d8f70cdb985754963" exitCode=0 Dec 09 12:38:54 crc kubenswrapper[4745]: I1209 12:38:54.072645 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerDied","Data":"26add0c6e8d7025444ef02c28bf6ec66d896614aadef298d8f70cdb985754963"} Dec 09 12:38:54 crc kubenswrapper[4745]: I1209 12:38:54.072673 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerStarted","Data":"7f7b5aeb223f6d7025168e2e351ffd15d1952901fcad46f2caa89ae2652dc1b0"} Dec 09 12:38:56 crc kubenswrapper[4745]: I1209 12:38:56.090417 4745 generic.go:334] "Generic (PLEG): container 
finished" podID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerID="a50b9942db1a9fedb208f62390c15f427370b16d177fb24b502f2cb20a2a0068" exitCode=0 Dec 09 12:38:56 crc kubenswrapper[4745]: I1209 12:38:56.091019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerDied","Data":"a50b9942db1a9fedb208f62390c15f427370b16d177fb24b502f2cb20a2a0068"} Dec 09 12:38:57 crc kubenswrapper[4745]: I1209 12:38:57.099472 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerStarted","Data":"25602a2822debf04539360c53ad299b5d91f727639a021a474b788bf3b6f6b24"} Dec 09 12:38:57 crc kubenswrapper[4745]: I1209 12:38:57.119192 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pnhkk" podStartSLOduration=2.731430054 podStartE2EDuration="5.11917067s" podCreationTimestamp="2025-12-09 12:38:52 +0000 UTC" firstStartedPulling="2025-12-09 12:38:54.07449031 +0000 UTC m=+4020.899691834" lastFinishedPulling="2025-12-09 12:38:56.462230926 +0000 UTC m=+4023.287432450" observedRunningTime="2025-12-09 12:38:57.11845099 +0000 UTC m=+4023.943652524" watchObservedRunningTime="2025-12-09 12:38:57.11917067 +0000 UTC m=+4023.944372204" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.649778 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.653206 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.660837 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.696248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.696599 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.696724 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whj8k\" (UniqueName: \"kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.797843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.797905 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.797934 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whj8k\" (UniqueName: \"kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.798414 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.798780 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.819181 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whj8k\" (UniqueName: \"kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k\") pod \"certified-operators-nxkqb\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:58 crc kubenswrapper[4745]: I1209 12:38:58.970590 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:38:59 crc kubenswrapper[4745]: I1209 12:38:59.601249 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:39:00 crc kubenswrapper[4745]: I1209 12:39:00.175044 4745 generic.go:334] "Generic (PLEG): container finished" podID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerID="17a773a759c18e25d7e6965d837020cce496387fe9cb5c6d29d416e6ac825704" exitCode=0 Dec 09 12:39:00 crc kubenswrapper[4745]: I1209 12:39:00.175099 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerDied","Data":"17a773a759c18e25d7e6965d837020cce496387fe9cb5c6d29d416e6ac825704"} Dec 09 12:39:00 crc kubenswrapper[4745]: I1209 12:39:00.175136 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerStarted","Data":"154e7447888f726fc7716a605c14ffe4e865b61d874401e5e5c8ec58689f6d78"} Dec 09 12:39:02 crc kubenswrapper[4745]: I1209 12:39:02.192009 4745 generic.go:334] "Generic (PLEG): container finished" podID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerID="911bbe1a50126300ba14e1a19ab61c06d830050e8dead49448fb5e40a400614c" exitCode=0 Dec 09 12:39:02 crc kubenswrapper[4745]: I1209 12:39:02.192354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerDied","Data":"911bbe1a50126300ba14e1a19ab61c06d830050e8dead49448fb5e40a400614c"} Dec 09 12:39:03 crc kubenswrapper[4745]: I1209 12:39:03.216786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" 
event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerStarted","Data":"9bc5bb3657008983e04540671316fff5f473b059bab3383315ed56955ac63b33"} Dec 09 12:39:03 crc kubenswrapper[4745]: I1209 12:39:03.230896 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:03 crc kubenswrapper[4745]: I1209 12:39:03.230971 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:03 crc kubenswrapper[4745]: I1209 12:39:03.239151 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nxkqb" podStartSLOduration=2.831680387 podStartE2EDuration="5.239130416s" podCreationTimestamp="2025-12-09 12:38:58 +0000 UTC" firstStartedPulling="2025-12-09 12:39:00.178251288 +0000 UTC m=+4027.003452812" lastFinishedPulling="2025-12-09 12:39:02.585701317 +0000 UTC m=+4029.410902841" observedRunningTime="2025-12-09 12:39:03.236741362 +0000 UTC m=+4030.061942896" watchObservedRunningTime="2025-12-09 12:39:03.239130416 +0000 UTC m=+4030.064331940" Dec 09 12:39:03 crc kubenswrapper[4745]: I1209 12:39:03.299981 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:04 crc kubenswrapper[4745]: I1209 12:39:04.265156 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:04 crc kubenswrapper[4745]: I1209 12:39:04.837260 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:39:06 crc kubenswrapper[4745]: I1209 12:39:06.238245 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pnhkk" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="registry-server" 
containerID="cri-o://25602a2822debf04539360c53ad299b5d91f727639a021a474b788bf3b6f6b24" gracePeriod=2 Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.249326 4745 generic.go:334] "Generic (PLEG): container finished" podID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerID="25602a2822debf04539360c53ad299b5d91f727639a021a474b788bf3b6f6b24" exitCode=0 Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.249419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerDied","Data":"25602a2822debf04539360c53ad299b5d91f727639a021a474b788bf3b6f6b24"} Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.330779 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.473439 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ksl8\" (UniqueName: \"kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8\") pod \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.473639 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities\") pod \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.473682 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content\") pod \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\" (UID: \"673ca696-765a-47fd-a3f2-70ba1e01e9c5\") " Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 
12:39:07.474715 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities" (OuterVolumeSpecName: "utilities") pod "673ca696-765a-47fd-a3f2-70ba1e01e9c5" (UID: "673ca696-765a-47fd-a3f2-70ba1e01e9c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.526637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "673ca696-765a-47fd-a3f2-70ba1e01e9c5" (UID: "673ca696-765a-47fd-a3f2-70ba1e01e9c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.575003 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.575055 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673ca696-765a-47fd-a3f2-70ba1e01e9c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.760343 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8" (OuterVolumeSpecName: "kube-api-access-7ksl8") pod "673ca696-765a-47fd-a3f2-70ba1e01e9c5" (UID: "673ca696-765a-47fd-a3f2-70ba1e01e9c5"). InnerVolumeSpecName "kube-api-access-7ksl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:39:07 crc kubenswrapper[4745]: I1209 12:39:07.777459 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ksl8\" (UniqueName: \"kubernetes.io/projected/673ca696-765a-47fd-a3f2-70ba1e01e9c5-kube-api-access-7ksl8\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.257921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pnhkk" event={"ID":"673ca696-765a-47fd-a3f2-70ba1e01e9c5","Type":"ContainerDied","Data":"7f7b5aeb223f6d7025168e2e351ffd15d1952901fcad46f2caa89ae2652dc1b0"} Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.258012 4745 scope.go:117] "RemoveContainer" containerID="25602a2822debf04539360c53ad299b5d91f727639a021a474b788bf3b6f6b24" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.258053 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pnhkk" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.276461 4745 scope.go:117] "RemoveContainer" containerID="a50b9942db1a9fedb208f62390c15f427370b16d177fb24b502f2cb20a2a0068" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.305880 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.307023 4745 scope.go:117] "RemoveContainer" containerID="26add0c6e8d7025444ef02c28bf6ec66d896614aadef298d8f70cdb985754963" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.314103 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pnhkk"] Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.971321 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:08 crc kubenswrapper[4745]: I1209 12:39:08.971369 
4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:09 crc kubenswrapper[4745]: I1209 12:39:09.188093 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:09 crc kubenswrapper[4745]: I1209 12:39:09.303371 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:09 crc kubenswrapper[4745]: I1209 12:39:09.567290 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" path="/var/lib/kubelet/pods/673ca696-765a-47fd-a3f2-70ba1e01e9c5/volumes" Dec 09 12:39:12 crc kubenswrapper[4745]: I1209 12:39:12.644997 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:39:12 crc kubenswrapper[4745]: I1209 12:39:12.645237 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nxkqb" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="registry-server" containerID="cri-o://9bc5bb3657008983e04540671316fff5f473b059bab3383315ed56955ac63b33" gracePeriod=2 Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.293335 4745 generic.go:334] "Generic (PLEG): container finished" podID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerID="9bc5bb3657008983e04540671316fff5f473b059bab3383315ed56955ac63b33" exitCode=0 Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.293535 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerDied","Data":"9bc5bb3657008983e04540671316fff5f473b059bab3383315ed56955ac63b33"} Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.525387 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.559991 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content\") pod \"cc9d4b60-89c6-40e2-8676-861de614a1ce\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.560082 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities\") pod \"cc9d4b60-89c6-40e2-8676-861de614a1ce\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.560120 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whj8k\" (UniqueName: \"kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k\") pod \"cc9d4b60-89c6-40e2-8676-861de614a1ce\" (UID: \"cc9d4b60-89c6-40e2-8676-861de614a1ce\") " Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.561315 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities" (OuterVolumeSpecName: "utilities") pod "cc9d4b60-89c6-40e2-8676-861de614a1ce" (UID: "cc9d4b60-89c6-40e2-8676-861de614a1ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.566117 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k" (OuterVolumeSpecName: "kube-api-access-whj8k") pod "cc9d4b60-89c6-40e2-8676-861de614a1ce" (UID: "cc9d4b60-89c6-40e2-8676-861de614a1ce"). InnerVolumeSpecName "kube-api-access-whj8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.611962 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc9d4b60-89c6-40e2-8676-861de614a1ce" (UID: "cc9d4b60-89c6-40e2-8676-861de614a1ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.662136 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.662166 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc9d4b60-89c6-40e2-8676-861de614a1ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:13 crc kubenswrapper[4745]: I1209 12:39:13.662176 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whj8k\" (UniqueName: \"kubernetes.io/projected/cc9d4b60-89c6-40e2-8676-861de614a1ce-kube-api-access-whj8k\") on node \"crc\" DevicePath \"\"" Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.304692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkqb" event={"ID":"cc9d4b60-89c6-40e2-8676-861de614a1ce","Type":"ContainerDied","Data":"154e7447888f726fc7716a605c14ffe4e865b61d874401e5e5c8ec58689f6d78"} Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.305289 4745 scope.go:117] "RemoveContainer" containerID="9bc5bb3657008983e04540671316fff5f473b059bab3383315ed56955ac63b33" Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.304780 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxkqb" Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.336866 4745 scope.go:117] "RemoveContainer" containerID="911bbe1a50126300ba14e1a19ab61c06d830050e8dead49448fb5e40a400614c" Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.346589 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.354544 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nxkqb"] Dec 09 12:39:14 crc kubenswrapper[4745]: I1209 12:39:14.374903 4745 scope.go:117] "RemoveContainer" containerID="17a773a759c18e25d7e6965d837020cce496387fe9cb5c6d29d416e6ac825704" Dec 09 12:39:15 crc kubenswrapper[4745]: I1209 12:39:15.571381 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" path="/var/lib/kubelet/pods/cc9d4b60-89c6-40e2-8676-861de614a1ce/volumes" Dec 09 12:39:55 crc kubenswrapper[4745]: I1209 12:39:55.476105 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:39:55 crc kubenswrapper[4745]: I1209 12:39:55.476644 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:40:25 crc kubenswrapper[4745]: I1209 12:40:25.474893 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:40:25 crc kubenswrapper[4745]: I1209 12:40:25.475420 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:40:55 crc kubenswrapper[4745]: I1209 12:40:55.476139 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:40:55 crc kubenswrapper[4745]: I1209 12:40:55.477222 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:40:55 crc kubenswrapper[4745]: I1209 12:40:55.477280 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:40:55 crc kubenswrapper[4745]: I1209 12:40:55.477975 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:40:55 crc kubenswrapper[4745]: I1209 12:40:55.478026 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6" gracePeriod=600 Dec 09 12:40:56 crc kubenswrapper[4745]: I1209 12:40:56.112303 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6" exitCode=0 Dec 09 12:40:56 crc kubenswrapper[4745]: I1209 12:40:56.112357 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6"} Dec 09 12:40:56 crc kubenswrapper[4745]: I1209 12:40:56.112390 4745 scope.go:117] "RemoveContainer" containerID="4e8ef189dc4f0bf70fa6e3fcdb8a52285c542424eb54946e3b96c3a6d1b7916e" Dec 09 12:40:57 crc kubenswrapper[4745]: I1209 12:40:57.124895 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865"} Dec 09 12:43:25 crc kubenswrapper[4745]: I1209 12:43:25.475618 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:43:25 crc kubenswrapper[4745]: I1209 12:43:25.476279 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:43:55 crc kubenswrapper[4745]: I1209 12:43:55.475430 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:43:55 crc kubenswrapper[4745]: I1209 12:43:55.476122 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.525112 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528063 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="extract-content" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528139 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="extract-content" Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528166 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="extract-utilities" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528175 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="extract-utilities" Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528186 4745 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528194 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528207 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="extract-utilities" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528215 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="extract-utilities" Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528231 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528238 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: E1209 12:44:16.528261 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="extract-content" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528271 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="extract-content" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528484 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="673ca696-765a-47fd-a3f2-70ba1e01e9c5" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.528536 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9d4b60-89c6-40e2-8676-861de614a1ce" containerName="registry-server" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.530014 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.531454 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.675593 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s4g\" (UniqueName: \"kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.675679 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.675712 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.777127 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.777180 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.777260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s4g\" (UniqueName: \"kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.777752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.777844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.799783 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s4g\" (UniqueName: \"kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g\") pod \"redhat-marketplace-pr6mc\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:16 crc kubenswrapper[4745]: I1209 12:44:16.859692 4745 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:17 crc kubenswrapper[4745]: I1209 12:44:17.306327 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:17 crc kubenswrapper[4745]: I1209 12:44:17.588895 4745 generic.go:334] "Generic (PLEG): container finished" podID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerID="f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254" exitCode=0 Dec 09 12:44:17 crc kubenswrapper[4745]: I1209 12:44:17.588938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerDied","Data":"f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254"} Dec 09 12:44:17 crc kubenswrapper[4745]: I1209 12:44:17.588963 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerStarted","Data":"274671a957a54382aef26a00575581e76403a1494e62281d40cfcdfd2664deb1"} Dec 09 12:44:17 crc kubenswrapper[4745]: I1209 12:44:17.591616 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:44:18 crc kubenswrapper[4745]: I1209 12:44:18.597767 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerStarted","Data":"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b"} Dec 09 12:44:19 crc kubenswrapper[4745]: I1209 12:44:19.606396 4745 generic.go:334] "Generic (PLEG): container finished" podID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerID="113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b" exitCode=0 Dec 09 12:44:19 crc kubenswrapper[4745]: I1209 12:44:19.606460 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerDied","Data":"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b"} Dec 09 12:44:21 crc kubenswrapper[4745]: I1209 12:44:21.634426 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerStarted","Data":"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624"} Dec 09 12:44:21 crc kubenswrapper[4745]: I1209 12:44:21.650886 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pr6mc" podStartSLOduration=3.140363339 podStartE2EDuration="5.650866143s" podCreationTimestamp="2025-12-09 12:44:16 +0000 UTC" firstStartedPulling="2025-12-09 12:44:17.590174223 +0000 UTC m=+4344.415375747" lastFinishedPulling="2025-12-09 12:44:20.100677027 +0000 UTC m=+4346.925878551" observedRunningTime="2025-12-09 12:44:21.649243929 +0000 UTC m=+4348.474445473" watchObservedRunningTime="2025-12-09 12:44:21.650866143 +0000 UTC m=+4348.476067667" Dec 09 12:44:25 crc kubenswrapper[4745]: I1209 12:44:25.475063 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:44:25 crc kubenswrapper[4745]: I1209 12:44:25.475642 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:44:25 crc kubenswrapper[4745]: I1209 12:44:25.475683 4745 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:44:25 crc kubenswrapper[4745]: I1209 12:44:25.476225 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:44:25 crc kubenswrapper[4745]: I1209 12:44:25.476272 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" gracePeriod=600 Dec 09 12:44:26 crc kubenswrapper[4745]: E1209 12:44:26.097058 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.666982 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" exitCode=0 Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.667032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865"} Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.667096 4745 scope.go:117] "RemoveContainer" containerID="5909a9da486347126f8bae70162a0d45dcad6ad4bfac4440ff4baddf476ff4a6" Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.667584 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:44:26 crc kubenswrapper[4745]: E1209 12:44:26.667773 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.860031 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.860362 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:26 crc kubenswrapper[4745]: I1209 12:44:26.908245 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:27 crc kubenswrapper[4745]: I1209 12:44:27.729966 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:27 crc kubenswrapper[4745]: I1209 12:44:27.776595 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:29 crc kubenswrapper[4745]: I1209 12:44:29.694237 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pr6mc" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="registry-server" containerID="cri-o://7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624" gracePeriod=2 Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.074653 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.214461 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content\") pod \"7d700554-dde9-4b11-a03a-3765fa1f5b11\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.214587 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities\") pod \"7d700554-dde9-4b11-a03a-3765fa1f5b11\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.214717 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s4g\" (UniqueName: \"kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g\") pod \"7d700554-dde9-4b11-a03a-3765fa1f5b11\" (UID: \"7d700554-dde9-4b11-a03a-3765fa1f5b11\") " Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.216457 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities" (OuterVolumeSpecName: "utilities") pod "7d700554-dde9-4b11-a03a-3765fa1f5b11" (UID: "7d700554-dde9-4b11-a03a-3765fa1f5b11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.222932 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g" (OuterVolumeSpecName: "kube-api-access-j9s4g") pod "7d700554-dde9-4b11-a03a-3765fa1f5b11" (UID: "7d700554-dde9-4b11-a03a-3765fa1f5b11"). InnerVolumeSpecName "kube-api-access-j9s4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.236391 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d700554-dde9-4b11-a03a-3765fa1f5b11" (UID: "7d700554-dde9-4b11-a03a-3765fa1f5b11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.316197 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.316239 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d700554-dde9-4b11-a03a-3765fa1f5b11-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.316250 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s4g\" (UniqueName: \"kubernetes.io/projected/7d700554-dde9-4b11-a03a-3765fa1f5b11-kube-api-access-j9s4g\") on node \"crc\" DevicePath \"\"" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.704715 4745 generic.go:334] "Generic (PLEG): container finished" podID="7d700554-dde9-4b11-a03a-3765fa1f5b11" 
containerID="7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624" exitCode=0 Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.704807 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerDied","Data":"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624"} Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.704815 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6mc" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.704870 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6mc" event={"ID":"7d700554-dde9-4b11-a03a-3765fa1f5b11","Type":"ContainerDied","Data":"274671a957a54382aef26a00575581e76403a1494e62281d40cfcdfd2664deb1"} Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.704907 4745 scope.go:117] "RemoveContainer" containerID="7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.732670 4745 scope.go:117] "RemoveContainer" containerID="113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.749084 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.758109 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6mc"] Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.864755 4745 scope.go:117] "RemoveContainer" containerID="f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.890597 4745 scope.go:117] "RemoveContainer" containerID="7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624" Dec 09 
12:44:30 crc kubenswrapper[4745]: E1209 12:44:30.891205 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624\": container with ID starting with 7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624 not found: ID does not exist" containerID="7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.891257 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624"} err="failed to get container status \"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624\": rpc error: code = NotFound desc = could not find container \"7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624\": container with ID starting with 7697be0a46d2a6e181a92bfcaf9e5ab6974ad478d7367010abbce79186a28624 not found: ID does not exist" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.891289 4745 scope.go:117] "RemoveContainer" containerID="113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b" Dec 09 12:44:30 crc kubenswrapper[4745]: E1209 12:44:30.891716 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b\": container with ID starting with 113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b not found: ID does not exist" containerID="113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.891752 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b"} err="failed to get container status 
\"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b\": rpc error: code = NotFound desc = could not find container \"113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b\": container with ID starting with 113006b1f83157c6fbcccd875188ef7f59303ec98b2bf2e6cc4eeeb77a44331b not found: ID does not exist" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.891789 4745 scope.go:117] "RemoveContainer" containerID="f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254" Dec 09 12:44:30 crc kubenswrapper[4745]: E1209 12:44:30.892165 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254\": container with ID starting with f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254 not found: ID does not exist" containerID="f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254" Dec 09 12:44:30 crc kubenswrapper[4745]: I1209 12:44:30.892194 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254"} err="failed to get container status \"f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254\": rpc error: code = NotFound desc = could not find container \"f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254\": container with ID starting with f096e6942a3aa4422581f0d713cb79aa8f3c48eccdf8a312f40f39640c1c9254 not found: ID does not exist" Dec 09 12:44:31 crc kubenswrapper[4745]: I1209 12:44:31.575793 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" path="/var/lib/kubelet/pods/7d700554-dde9-4b11-a03a-3765fa1f5b11/volumes" Dec 09 12:44:39 crc kubenswrapper[4745]: I1209 12:44:39.555057 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 
12:44:39 crc kubenswrapper[4745]: E1209 12:44:39.555762 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:44:51 crc kubenswrapper[4745]: I1209 12:44:51.555323 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:44:51 crc kubenswrapper[4745]: E1209 12:44:51.556166 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.178179 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2"] Dec 09 12:45:00 crc kubenswrapper[4745]: E1209 12:45:00.179691 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.179711 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[4745]: E1209 12:45:00.179727 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="extract-content" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 
12:45:00.179733 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="extract-content" Dec 09 12:45:00 crc kubenswrapper[4745]: E1209 12:45:00.179759 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="extract-utilities" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.179771 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="extract-utilities" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.179961 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d700554-dde9-4b11-a03a-3765fa1f5b11" containerName="registry-server" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.180890 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.183822 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.183825 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.199580 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2"] Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.236225 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" 
Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.236320 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.236691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfj7\" (UniqueName: \"kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.338167 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.338221 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.338311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfj7\" (UniqueName: \"kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7\") 
pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.339500 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.344762 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.359159 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfj7\" (UniqueName: \"kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7\") pod \"collect-profiles-29421405-s5wc2\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.503111 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.913542 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2"] Dec 09 12:45:00 crc kubenswrapper[4745]: I1209 12:45:00.927203 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" event={"ID":"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc","Type":"ContainerStarted","Data":"956965a7d76b599fdbdc219e69a2b8b83c0893abd31a7e6c16cf7f94e12afab1"} Dec 09 12:45:01 crc kubenswrapper[4745]: I1209 12:45:01.935063 4745 generic.go:334] "Generic (PLEG): container finished" podID="bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" containerID="6a0c0dde81910c908ed11f0db17bf79a47d1f26f84ddc56822740d81e867ba2a" exitCode=0 Dec 09 12:45:01 crc kubenswrapper[4745]: I1209 12:45:01.935111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" event={"ID":"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc","Type":"ContainerDied","Data":"6a0c0dde81910c908ed11f0db17bf79a47d1f26f84ddc56822740d81e867ba2a"} Dec 09 12:45:02 crc kubenswrapper[4745]: I1209 12:45:02.554241 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:45:02 crc kubenswrapper[4745]: E1209 12:45:02.554494 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.206591 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.279132 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume\") pod \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.279204 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume\") pod \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.279335 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tfj7\" (UniqueName: \"kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7\") pod \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\" (UID: \"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc\") " Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.280222 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" (UID: "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.284195 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7" (OuterVolumeSpecName: "kube-api-access-8tfj7") pod "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" (UID: "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc"). 
InnerVolumeSpecName "kube-api-access-8tfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.284412 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" (UID: "bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.380911 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tfj7\" (UniqueName: \"kubernetes.io/projected/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-kube-api-access-8tfj7\") on node \"crc\" DevicePath \"\"" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.380962 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.380971 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.949239 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" event={"ID":"bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc","Type":"ContainerDied","Data":"956965a7d76b599fdbdc219e69a2b8b83c0893abd31a7e6c16cf7f94e12afab1"} Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.949581 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956965a7d76b599fdbdc219e69a2b8b83c0893abd31a7e6c16cf7f94e12afab1" Dec 09 12:45:03 crc kubenswrapper[4745]: I1209 12:45:03.949291 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421405-s5wc2" Dec 09 12:45:04 crc kubenswrapper[4745]: I1209 12:45:04.278664 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh"] Dec 09 12:45:04 crc kubenswrapper[4745]: I1209 12:45:04.302597 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421360-4t5hh"] Dec 09 12:45:05 crc kubenswrapper[4745]: I1209 12:45:05.562830 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727fb6c9-9336-4c44-8ad9-d44cd6d6da60" path="/var/lib/kubelet/pods/727fb6c9-9336-4c44-8ad9-d44cd6d6da60/volumes" Dec 09 12:45:06 crc kubenswrapper[4745]: I1209 12:45:06.165433 4745 scope.go:117] "RemoveContainer" containerID="b527afeda634a1cf415e0da35f7a7204a71c2a4a0a2f1c3a308b81fcea7a0e35" Dec 09 12:45:16 crc kubenswrapper[4745]: I1209 12:45:16.554847 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:45:16 crc kubenswrapper[4745]: E1209 12:45:16.555618 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:45:29 crc kubenswrapper[4745]: I1209 12:45:29.554530 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:45:29 crc kubenswrapper[4745]: E1209 12:45:29.555188 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:45:41 crc kubenswrapper[4745]: I1209 12:45:41.554999 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:45:41 crc kubenswrapper[4745]: E1209 12:45:41.555804 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:45:52 crc kubenswrapper[4745]: I1209 12:45:52.555317 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:45:52 crc kubenswrapper[4745]: E1209 12:45:52.556206 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:46:04 crc kubenswrapper[4745]: I1209 12:46:04.555099 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:46:04 crc kubenswrapper[4745]: E1209 12:46:04.555874 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:46:18 crc kubenswrapper[4745]: I1209 12:46:18.555318 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:46:18 crc kubenswrapper[4745]: E1209 12:46:18.556052 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:46:33 crc kubenswrapper[4745]: I1209 12:46:33.565997 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:46:33 crc kubenswrapper[4745]: E1209 12:46:33.567129 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:46:47 crc kubenswrapper[4745]: I1209 12:46:47.559179 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:46:47 crc kubenswrapper[4745]: E1209 12:46:47.560129 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:46:59 crc kubenswrapper[4745]: I1209 12:46:59.555120 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:46:59 crc kubenswrapper[4745]: E1209 12:46:59.555915 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:47:11 crc kubenswrapper[4745]: I1209 12:47:11.556695 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:47:11 crc kubenswrapper[4745]: E1209 12:47:11.558175 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:47:25 crc kubenswrapper[4745]: I1209 12:47:25.555181 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:47:25 crc kubenswrapper[4745]: E1209 12:47:25.555983 4745 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:47:37 crc kubenswrapper[4745]: I1209 12:47:37.555023 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:47:37 crc kubenswrapper[4745]: E1209 12:47:37.555762 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:47:51 crc kubenswrapper[4745]: I1209 12:47:51.555668 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:47:51 crc kubenswrapper[4745]: E1209 12:47:51.556641 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:48:05 crc kubenswrapper[4745]: I1209 12:48:05.555084 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:48:05 crc kubenswrapper[4745]: E1209 12:48:05.556936 4745 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:48:19 crc kubenswrapper[4745]: I1209 12:48:19.555245 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:48:19 crc kubenswrapper[4745]: E1209 12:48:19.555979 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.555430 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:48:32 crc kubenswrapper[4745]: E1209 12:48:32.556300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.834814 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:32 crc kubenswrapper[4745]: E1209 12:48:32.835217 4745 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" containerName="collect-profiles" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.835251 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" containerName="collect-profiles" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.835478 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdd20b5a-7e6f-408d-af9b-dc6eda8d07fc" containerName="collect-profiles" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.838405 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.847731 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.849700 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcnmp\" (UniqueName: \"kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.849757 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.849781 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content\") pod 
\"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.951292 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.951349 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.951432 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcnmp\" (UniqueName: \"kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.952029 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.952005 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities\") pod \"redhat-operators-4vgtf\" (UID: 
\"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:32 crc kubenswrapper[4745]: I1209 12:48:32.972763 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcnmp\" (UniqueName: \"kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp\") pod \"redhat-operators-4vgtf\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:33 crc kubenswrapper[4745]: I1209 12:48:33.191457 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:33 crc kubenswrapper[4745]: I1209 12:48:33.623093 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:34 crc kubenswrapper[4745]: I1209 12:48:34.427591 4745 generic.go:334] "Generic (PLEG): container finished" podID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerID="bb5032ecd5ed033f9a9dbf67e20a6ddb309485afe96c2067718c95883ec9f5ab" exitCode=0 Dec 09 12:48:34 crc kubenswrapper[4745]: I1209 12:48:34.427645 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerDied","Data":"bb5032ecd5ed033f9a9dbf67e20a6ddb309485afe96c2067718c95883ec9f5ab"} Dec 09 12:48:34 crc kubenswrapper[4745]: I1209 12:48:34.427870 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerStarted","Data":"34505240624f4f3b3b00d66122d4bb6605187ec2d5ad23ba72acfac7f996398d"} Dec 09 12:48:35 crc kubenswrapper[4745]: I1209 12:48:35.436736 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" 
event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerStarted","Data":"ce242089acdedead29349039e0bfc345449d06655e9d39721fc504d4b0310c80"} Dec 09 12:48:36 crc kubenswrapper[4745]: I1209 12:48:36.450071 4745 generic.go:334] "Generic (PLEG): container finished" podID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerID="ce242089acdedead29349039e0bfc345449d06655e9d39721fc504d4b0310c80" exitCode=0 Dec 09 12:48:36 crc kubenswrapper[4745]: I1209 12:48:36.450256 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerDied","Data":"ce242089acdedead29349039e0bfc345449d06655e9d39721fc504d4b0310c80"} Dec 09 12:48:37 crc kubenswrapper[4745]: I1209 12:48:37.459166 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerStarted","Data":"16e0c4c0786b966e7e0a2496d0e84f8fef3ccca97e73aefa7757501f20da701b"} Dec 09 12:48:37 crc kubenswrapper[4745]: I1209 12:48:37.478594 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4vgtf" podStartSLOduration=3.071216241 podStartE2EDuration="5.478571156s" podCreationTimestamp="2025-12-09 12:48:32 +0000 UTC" firstStartedPulling="2025-12-09 12:48:34.42994544 +0000 UTC m=+4601.255146964" lastFinishedPulling="2025-12-09 12:48:36.837300355 +0000 UTC m=+4603.662501879" observedRunningTime="2025-12-09 12:48:37.474041364 +0000 UTC m=+4604.299242888" watchObservedRunningTime="2025-12-09 12:48:37.478571156 +0000 UTC m=+4604.303772680" Dec 09 12:48:43 crc kubenswrapper[4745]: I1209 12:48:43.192171 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:43 crc kubenswrapper[4745]: I1209 12:48:43.192827 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:43 crc kubenswrapper[4745]: I1209 12:48:43.234958 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:43 crc kubenswrapper[4745]: I1209 12:48:43.538847 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:43 crc kubenswrapper[4745]: I1209 12:48:43.599993 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:45 crc kubenswrapper[4745]: I1209 12:48:45.506908 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4vgtf" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="registry-server" containerID="cri-o://16e0c4c0786b966e7e0a2496d0e84f8fef3ccca97e73aefa7757501f20da701b" gracePeriod=2 Dec 09 12:48:45 crc kubenswrapper[4745]: I1209 12:48:45.555080 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:48:45 crc kubenswrapper[4745]: E1209 12:48:45.555322 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.547266 4745 generic.go:334] "Generic (PLEG): container finished" podID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerID="16e0c4c0786b966e7e0a2496d0e84f8fef3ccca97e73aefa7757501f20da701b" exitCode=0 Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.547359 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerDied","Data":"16e0c4c0786b966e7e0a2496d0e84f8fef3ccca97e73aefa7757501f20da701b"} Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.747716 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.882117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content\") pod \"2eea0dce-cbc6-4dbb-885c-3f353245555d\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.882248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities\") pod \"2eea0dce-cbc6-4dbb-885c-3f353245555d\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.882335 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcnmp\" (UniqueName: \"kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp\") pod \"2eea0dce-cbc6-4dbb-885c-3f353245555d\" (UID: \"2eea0dce-cbc6-4dbb-885c-3f353245555d\") " Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.883188 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities" (OuterVolumeSpecName: "utilities") pod "2eea0dce-cbc6-4dbb-885c-3f353245555d" (UID: "2eea0dce-cbc6-4dbb-885c-3f353245555d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.953713 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp" (OuterVolumeSpecName: "kube-api-access-mcnmp") pod "2eea0dce-cbc6-4dbb-885c-3f353245555d" (UID: "2eea0dce-cbc6-4dbb-885c-3f353245555d"). InnerVolumeSpecName "kube-api-access-mcnmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.985759 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:48:48 crc kubenswrapper[4745]: I1209 12:48:48.985815 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcnmp\" (UniqueName: \"kubernetes.io/projected/2eea0dce-cbc6-4dbb-885c-3f353245555d-kube-api-access-mcnmp\") on node \"crc\" DevicePath \"\"" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.000836 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eea0dce-cbc6-4dbb-885c-3f353245555d" (UID: "2eea0dce-cbc6-4dbb-885c-3f353245555d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.087745 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea0dce-cbc6-4dbb-885c-3f353245555d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.564355 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4vgtf" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.566016 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vgtf" event={"ID":"2eea0dce-cbc6-4dbb-885c-3f353245555d","Type":"ContainerDied","Data":"34505240624f4f3b3b00d66122d4bb6605187ec2d5ad23ba72acfac7f996398d"} Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.566060 4745 scope.go:117] "RemoveContainer" containerID="16e0c4c0786b966e7e0a2496d0e84f8fef3ccca97e73aefa7757501f20da701b" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.594708 4745 scope.go:117] "RemoveContainer" containerID="ce242089acdedead29349039e0bfc345449d06655e9d39721fc504d4b0310c80" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.628399 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.632267 4745 scope.go:117] "RemoveContainer" containerID="bb5032ecd5ed033f9a9dbf67e20a6ddb309485afe96c2067718c95883ec9f5ab" Dec 09 12:48:49 crc kubenswrapper[4745]: I1209 12:48:49.635266 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4vgtf"] Dec 09 12:48:51 crc kubenswrapper[4745]: I1209 12:48:51.564459 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" path="/var/lib/kubelet/pods/2eea0dce-cbc6-4dbb-885c-3f353245555d/volumes" Dec 09 12:49:00 crc kubenswrapper[4745]: I1209 12:49:00.554651 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:49:00 crc kubenswrapper[4745]: E1209 12:49:00.555395 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:49:13 crc kubenswrapper[4745]: I1209 12:49:13.559639 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:49:13 crc kubenswrapper[4745]: E1209 12:49:13.560456 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:49:28 crc kubenswrapper[4745]: I1209 12:49:28.554732 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:49:28 crc kubenswrapper[4745]: I1209 12:49:28.842135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec"} Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.221041 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:40 crc kubenswrapper[4745]: E1209 12:49:40.223619 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="registry-server" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.223679 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="registry-server" Dec 09 12:49:40 crc 
kubenswrapper[4745]: E1209 12:49:40.223729 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="extract-utilities" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.223738 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="extract-utilities" Dec 09 12:49:40 crc kubenswrapper[4745]: E1209 12:49:40.223764 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="extract-content" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.223771 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="extract-content" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.224182 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eea0dce-cbc6-4dbb-885c-3f353245555d" containerName="registry-server" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.226964 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.244437 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.352652 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.352725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.352975 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2t8h\" (UniqueName: \"kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.454232 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2t8h\" (UniqueName: \"kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.454381 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.454410 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.455047 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.455158 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.474323 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2t8h\" (UniqueName: \"kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h\") pod \"certified-operators-fzspp\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.554048 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:40 crc kubenswrapper[4745]: I1209 12:49:40.914408 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:41 crc kubenswrapper[4745]: I1209 12:49:41.931261 4745 generic.go:334] "Generic (PLEG): container finished" podID="75e239ca-c684-45bd-8bb3-6abef9092950" containerID="2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d" exitCode=0 Dec 09 12:49:41 crc kubenswrapper[4745]: I1209 12:49:41.931376 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerDied","Data":"2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d"} Dec 09 12:49:41 crc kubenswrapper[4745]: I1209 12:49:41.931689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerStarted","Data":"98e76c1ea68de8d9225170b62710b65638a056155857701a661d83f1af9166b3"} Dec 09 12:49:41 crc kubenswrapper[4745]: I1209 12:49:41.935528 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:49:42 crc kubenswrapper[4745]: I1209 12:49:42.941664 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerStarted","Data":"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5"} Dec 09 12:49:43 crc kubenswrapper[4745]: I1209 12:49:43.949688 4745 generic.go:334] "Generic (PLEG): container finished" podID="75e239ca-c684-45bd-8bb3-6abef9092950" containerID="14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5" exitCode=0 Dec 09 12:49:43 crc kubenswrapper[4745]: I1209 12:49:43.949780 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerDied","Data":"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5"} Dec 09 12:49:44 crc kubenswrapper[4745]: I1209 12:49:44.959566 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerStarted","Data":"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52"} Dec 09 12:49:44 crc kubenswrapper[4745]: I1209 12:49:44.986637 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fzspp" podStartSLOduration=2.61370664 podStartE2EDuration="4.986617616s" podCreationTimestamp="2025-12-09 12:49:40 +0000 UTC" firstStartedPulling="2025-12-09 12:49:41.935232926 +0000 UTC m=+4668.760434450" lastFinishedPulling="2025-12-09 12:49:44.308143902 +0000 UTC m=+4671.133345426" observedRunningTime="2025-12-09 12:49:44.984023566 +0000 UTC m=+4671.809225090" watchObservedRunningTime="2025-12-09 12:49:44.986617616 +0000 UTC m=+4671.811819140" Dec 09 12:49:50 crc kubenswrapper[4745]: I1209 12:49:50.555286 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:50 crc kubenswrapper[4745]: I1209 12:49:50.555886 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:50 crc kubenswrapper[4745]: I1209 12:49:50.596112 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:51 crc kubenswrapper[4745]: I1209 12:49:51.031742 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:51 crc kubenswrapper[4745]: I1209 12:49:51.080540 
4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:53 crc kubenswrapper[4745]: I1209 12:49:53.007496 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fzspp" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="registry-server" containerID="cri-o://006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52" gracePeriod=2 Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.820135 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.854628 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content\") pod \"75e239ca-c684-45bd-8bb3-6abef9092950\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.854808 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2t8h\" (UniqueName: \"kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h\") pod \"75e239ca-c684-45bd-8bb3-6abef9092950\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.854918 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities\") pod \"75e239ca-c684-45bd-8bb3-6abef9092950\" (UID: \"75e239ca-c684-45bd-8bb3-6abef9092950\") " Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.855714 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities" (OuterVolumeSpecName: "utilities") pod 
"75e239ca-c684-45bd-8bb3-6abef9092950" (UID: "75e239ca-c684-45bd-8bb3-6abef9092950"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.861783 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h" (OuterVolumeSpecName: "kube-api-access-k2t8h") pod "75e239ca-c684-45bd-8bb3-6abef9092950" (UID: "75e239ca-c684-45bd-8bb3-6abef9092950"). InnerVolumeSpecName "kube-api-access-k2t8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.908392 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75e239ca-c684-45bd-8bb3-6abef9092950" (UID: "75e239ca-c684-45bd-8bb3-6abef9092950"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.956657 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2t8h\" (UniqueName: \"kubernetes.io/projected/75e239ca-c684-45bd-8bb3-6abef9092950-kube-api-access-k2t8h\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.956690 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:54 crc kubenswrapper[4745]: I1209 12:49:54.956704 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e239ca-c684-45bd-8bb3-6abef9092950-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.025357 4745 generic.go:334] "Generic (PLEG): container finished" podID="75e239ca-c684-45bd-8bb3-6abef9092950" containerID="006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52" exitCode=0 Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.025485 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzspp" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.025471 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerDied","Data":"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52"} Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.025654 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzspp" event={"ID":"75e239ca-c684-45bd-8bb3-6abef9092950","Type":"ContainerDied","Data":"98e76c1ea68de8d9225170b62710b65638a056155857701a661d83f1af9166b3"} Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.025693 4745 scope.go:117] "RemoveContainer" containerID="006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.044613 4745 scope.go:117] "RemoveContainer" containerID="14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.059715 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.081311 4745 scope.go:117] "RemoveContainer" containerID="2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.082677 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fzspp"] Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.103770 4745 scope.go:117] "RemoveContainer" containerID="006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52" Dec 09 12:49:55 crc kubenswrapper[4745]: E1209 12:49:55.104209 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52\": container with ID starting with 006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52 not found: ID does not exist" containerID="006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.104253 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52"} err="failed to get container status \"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52\": rpc error: code = NotFound desc = could not find container \"006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52\": container with ID starting with 006723ce4920d52d899bb2248459fd084d6ff13bd91c6d3e5ccdc44feb876a52 not found: ID does not exist" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.104281 4745 scope.go:117] "RemoveContainer" containerID="14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5" Dec 09 12:49:55 crc kubenswrapper[4745]: E1209 12:49:55.104578 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5\": container with ID starting with 14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5 not found: ID does not exist" containerID="14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.104610 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5"} err="failed to get container status \"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5\": rpc error: code = NotFound desc = could not find container \"14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5\": container with ID 
starting with 14684a3fad2027b80a3dbedfd1bd9c127ccd7c32c6364257dd390921732c86a5 not found: ID does not exist" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.104631 4745 scope.go:117] "RemoveContainer" containerID="2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d" Dec 09 12:49:55 crc kubenswrapper[4745]: E1209 12:49:55.104814 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d\": container with ID starting with 2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d not found: ID does not exist" containerID="2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.104843 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d"} err="failed to get container status \"2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d\": rpc error: code = NotFound desc = could not find container \"2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d\": container with ID starting with 2f49f3ae378db2d0cd0494e3de0e1f45b06291bafe229fe90bc513584c713a8d not found: ID does not exist" Dec 09 12:49:55 crc kubenswrapper[4745]: I1209 12:49:55.562815 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" path="/var/lib/kubelet/pods/75e239ca-c684-45bd-8bb3-6abef9092950/volumes" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.192237 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:16 crc kubenswrapper[4745]: E1209 12:50:16.193115 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="extract-content" Dec 09 12:50:16 crc 
kubenswrapper[4745]: I1209 12:50:16.193138 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="extract-content" Dec 09 12:50:16 crc kubenswrapper[4745]: E1209 12:50:16.193167 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="extract-utilities" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.193180 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="extract-utilities" Dec 09 12:50:16 crc kubenswrapper[4745]: E1209 12:50:16.193210 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="registry-server" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.193221 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="registry-server" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.193459 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e239ca-c684-45bd-8bb3-6abef9092950" containerName="registry-server" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.194887 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.208446 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.393200 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.393263 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.393576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwsb\" (UniqueName: \"kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.495576 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwsb\" (UniqueName: \"kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.495723 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.495753 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.496230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.496475 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.526676 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwsb\" (UniqueName: \"kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb\") pod \"community-operators-p445t\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:16 crc kubenswrapper[4745]: I1209 12:50:16.812759 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:17 crc kubenswrapper[4745]: I1209 12:50:17.286601 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:18 crc kubenswrapper[4745]: I1209 12:50:18.199448 4745 generic.go:334] "Generic (PLEG): container finished" podID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerID="6a1f5e1909d21023f801da5d6b2a9e80aa611102c86ee9c07d91cdf8995da774" exitCode=0 Dec 09 12:50:18 crc kubenswrapper[4745]: I1209 12:50:18.199602 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerDied","Data":"6a1f5e1909d21023f801da5d6b2a9e80aa611102c86ee9c07d91cdf8995da774"} Dec 09 12:50:18 crc kubenswrapper[4745]: I1209 12:50:18.199711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerStarted","Data":"04ffcfa0cd08a5f56428ff195ebf9848eb41279a41f061475d457fbc9d9bebb2"} Dec 09 12:50:19 crc kubenswrapper[4745]: I1209 12:50:19.207003 4745 generic.go:334] "Generic (PLEG): container finished" podID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerID="41011f3529c08b293dbee81103c695711189ae3c96753c6c8daa8a3875060d09" exitCode=0 Dec 09 12:50:19 crc kubenswrapper[4745]: I1209 12:50:19.207094 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerDied","Data":"41011f3529c08b293dbee81103c695711189ae3c96753c6c8daa8a3875060d09"} Dec 09 12:50:20 crc kubenswrapper[4745]: I1209 12:50:20.215974 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" 
event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerStarted","Data":"5e78bf94d5bfe6b98055405ca5c0b1c8d5ad9b8cfac00a002e99c28c940a2973"} Dec 09 12:50:20 crc kubenswrapper[4745]: I1209 12:50:20.234787 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p445t" podStartSLOduration=2.780523615 podStartE2EDuration="4.234768855s" podCreationTimestamp="2025-12-09 12:50:16 +0000 UTC" firstStartedPulling="2025-12-09 12:50:18.210498495 +0000 UTC m=+4705.035700059" lastFinishedPulling="2025-12-09 12:50:19.664743775 +0000 UTC m=+4706.489945299" observedRunningTime="2025-12-09 12:50:20.232827463 +0000 UTC m=+4707.058028987" watchObservedRunningTime="2025-12-09 12:50:20.234768855 +0000 UTC m=+4707.059970379" Dec 09 12:50:26 crc kubenswrapper[4745]: I1209 12:50:26.814015 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:26 crc kubenswrapper[4745]: I1209 12:50:26.814547 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:26 crc kubenswrapper[4745]: I1209 12:50:26.853991 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:27 crc kubenswrapper[4745]: I1209 12:50:27.326186 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:27 crc kubenswrapper[4745]: I1209 12:50:27.375424 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:29 crc kubenswrapper[4745]: I1209 12:50:29.294732 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p445t" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="registry-server" 
containerID="cri-o://5e78bf94d5bfe6b98055405ca5c0b1c8d5ad9b8cfac00a002e99c28c940a2973" gracePeriod=2 Dec 09 12:50:30 crc kubenswrapper[4745]: I1209 12:50:30.304795 4745 generic.go:334] "Generic (PLEG): container finished" podID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerID="5e78bf94d5bfe6b98055405ca5c0b1c8d5ad9b8cfac00a002e99c28c940a2973" exitCode=0 Dec 09 12:50:30 crc kubenswrapper[4745]: I1209 12:50:30.304935 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerDied","Data":"5e78bf94d5bfe6b98055405ca5c0b1c8d5ad9b8cfac00a002e99c28c940a2973"} Dec 09 12:50:30 crc kubenswrapper[4745]: I1209 12:50:30.850573 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.026683 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content\") pod \"62e0e433-54b7-4d69-88aa-e1f622397ed1\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.026726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities\") pod \"62e0e433-54b7-4d69-88aa-e1f622397ed1\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.026793 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwsb\" (UniqueName: \"kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb\") pod \"62e0e433-54b7-4d69-88aa-e1f622397ed1\" (UID: \"62e0e433-54b7-4d69-88aa-e1f622397ed1\") " Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 
12:50:31.027740 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities" (OuterVolumeSpecName: "utilities") pod "62e0e433-54b7-4d69-88aa-e1f622397ed1" (UID: "62e0e433-54b7-4d69-88aa-e1f622397ed1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.034474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb" (OuterVolumeSpecName: "kube-api-access-2wwsb") pod "62e0e433-54b7-4d69-88aa-e1f622397ed1" (UID: "62e0e433-54b7-4d69-88aa-e1f622397ed1"). InnerVolumeSpecName "kube-api-access-2wwsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.078835 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62e0e433-54b7-4d69-88aa-e1f622397ed1" (UID: "62e0e433-54b7-4d69-88aa-e1f622397ed1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.127968 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.128002 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e0e433-54b7-4d69-88aa-e1f622397ed1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.128013 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwsb\" (UniqueName: \"kubernetes.io/projected/62e0e433-54b7-4d69-88aa-e1f622397ed1-kube-api-access-2wwsb\") on node \"crc\" DevicePath \"\"" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.314745 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p445t" event={"ID":"62e0e433-54b7-4d69-88aa-e1f622397ed1","Type":"ContainerDied","Data":"04ffcfa0cd08a5f56428ff195ebf9848eb41279a41f061475d457fbc9d9bebb2"} Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.314975 4745 scope.go:117] "RemoveContainer" containerID="5e78bf94d5bfe6b98055405ca5c0b1c8d5ad9b8cfac00a002e99c28c940a2973" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.315026 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p445t" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.330702 4745 scope.go:117] "RemoveContainer" containerID="41011f3529c08b293dbee81103c695711189ae3c96753c6c8daa8a3875060d09" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.355002 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.360931 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p445t"] Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.379794 4745 scope.go:117] "RemoveContainer" containerID="6a1f5e1909d21023f801da5d6b2a9e80aa611102c86ee9c07d91cdf8995da774" Dec 09 12:50:31 crc kubenswrapper[4745]: I1209 12:50:31.564685 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" path="/var/lib/kubelet/pods/62e0e433-54b7-4d69-88aa-e1f622397ed1/volumes" Dec 09 12:51:55 crc kubenswrapper[4745]: I1209 12:51:55.475119 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:51:55 crc kubenswrapper[4745]: I1209 12:51:55.475787 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:52:25 crc kubenswrapper[4745]: I1209 12:52:25.475089 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:52:25 crc kubenswrapper[4745]: I1209 12:52:25.475686 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:52:55 crc kubenswrapper[4745]: I1209 12:52:55.475268 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:52:55 crc kubenswrapper[4745]: I1209 12:52:55.475880 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:52:55 crc kubenswrapper[4745]: I1209 12:52:55.475935 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:52:55 crc kubenswrapper[4745]: I1209 12:52:55.476585 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:52:55 crc kubenswrapper[4745]: I1209 12:52:55.476655 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec" gracePeriod=600 Dec 09 12:52:56 crc kubenswrapper[4745]: I1209 12:52:56.347921 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec" exitCode=0 Dec 09 12:52:56 crc kubenswrapper[4745]: I1209 12:52:56.348020 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec"} Dec 09 12:52:56 crc kubenswrapper[4745]: I1209 12:52:56.348245 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerStarted","Data":"e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad"} Dec 09 12:52:56 crc kubenswrapper[4745]: I1209 12:52:56.348267 4745 scope.go:117] "RemoveContainer" containerID="dba9ba79ae609fa1800bc2fe47da534a9e82a3949f779bbf7535908332455865" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.171194 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:20 crc kubenswrapper[4745]: E1209 12:54:20.172311 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="registry-server" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.172338 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="registry-server" 
Dec 09 12:54:20 crc kubenswrapper[4745]: E1209 12:54:20.172367 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="extract-utilities" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.172380 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="extract-utilities" Dec 09 12:54:20 crc kubenswrapper[4745]: E1209 12:54:20.172408 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="extract-content" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.172419 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="extract-content" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.172836 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e0e433-54b7-4d69-88aa-e1f622397ed1" containerName="registry-server" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.174398 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.178562 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.326609 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zclw6\" (UniqueName: \"kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.326719 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.326782 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.427881 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zclw6\" (UniqueName: \"kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.427953 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.427980 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.428434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.428668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.448568 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zclw6\" (UniqueName: \"kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6\") pod \"redhat-marketplace-zkfqc\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.538575 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.956639 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:20 crc kubenswrapper[4745]: I1209 12:54:20.988544 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerStarted","Data":"33439b0fcf0f16fa1b5f2d28a91e76dad18b261701b9ca7e24dfcaadb0e4df5a"} Dec 09 12:54:21 crc kubenswrapper[4745]: I1209 12:54:21.997628 4745 generic.go:334] "Generic (PLEG): container finished" podID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerID="0a97a48e0f8acf707bcf879a8ea4cab2b24247c7597208fcb8657d6b3f599f7d" exitCode=0 Dec 09 12:54:21 crc kubenswrapper[4745]: I1209 12:54:21.997695 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerDied","Data":"0a97a48e0f8acf707bcf879a8ea4cab2b24247c7597208fcb8657d6b3f599f7d"} Dec 09 12:54:23 crc kubenswrapper[4745]: I1209 12:54:23.006744 4745 generic.go:334] "Generic (PLEG): container finished" podID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerID="d198ed650136a5c37c94b5dc7d43d4b323e6f43b66deb82140fa933f1db38afa" exitCode=0 Dec 09 12:54:23 crc kubenswrapper[4745]: I1209 12:54:23.006825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerDied","Data":"d198ed650136a5c37c94b5dc7d43d4b323e6f43b66deb82140fa933f1db38afa"} Dec 09 12:54:24 crc kubenswrapper[4745]: I1209 12:54:24.014747 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" 
event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerStarted","Data":"5c215580e2e0b0cf62bc7ac30cef1d5fa2c2a75d0d764e100f830b2efb1a4706"} Dec 09 12:54:24 crc kubenswrapper[4745]: I1209 12:54:24.030634 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zkfqc" podStartSLOduration=2.534894995 podStartE2EDuration="4.030617133s" podCreationTimestamp="2025-12-09 12:54:20 +0000 UTC" firstStartedPulling="2025-12-09 12:54:21.999242311 +0000 UTC m=+4948.824443835" lastFinishedPulling="2025-12-09 12:54:23.494964449 +0000 UTC m=+4950.320165973" observedRunningTime="2025-12-09 12:54:24.029627087 +0000 UTC m=+4950.854828621" watchObservedRunningTime="2025-12-09 12:54:24.030617133 +0000 UTC m=+4950.855818657" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.207298 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nzmtg/must-gather-5gqgb"] Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.208965 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.210574 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nzmtg"/"kube-root-ca.crt" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.210874 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nzmtg"/"openshift-service-ca.crt" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.218587 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nzmtg/must-gather-5gqgb"] Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.219097 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nzmtg"/"default-dockercfg-v6cm9" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.291799 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdct\" (UniqueName: \"kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.291886 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.393889 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdct\" (UniqueName: \"kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " 
pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.393997 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.394444 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.413319 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdct\" (UniqueName: \"kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct\") pod \"must-gather-5gqgb\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.524217 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:54:29 crc kubenswrapper[4745]: I1209 12:54:29.985320 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nzmtg/must-gather-5gqgb"] Dec 09 12:54:30 crc kubenswrapper[4745]: I1209 12:54:30.063583 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" event={"ID":"655b0c80-f015-45c6-ab25-67eff498ee15","Type":"ContainerStarted","Data":"c310656548443fa3a046a28786ada9058ec2d3cab00b3af4bcad828d00a32bdd"} Dec 09 12:54:30 crc kubenswrapper[4745]: I1209 12:54:30.539374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:30 crc kubenswrapper[4745]: I1209 12:54:30.539444 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:30 crc kubenswrapper[4745]: I1209 12:54:30.580900 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:31 crc kubenswrapper[4745]: I1209 12:54:31.106938 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:35 crc kubenswrapper[4745]: I1209 12:54:35.756747 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:35 crc kubenswrapper[4745]: I1209 12:54:35.757011 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zkfqc" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="registry-server" containerID="cri-o://5c215580e2e0b0cf62bc7ac30cef1d5fa2c2a75d0d764e100f830b2efb1a4706" gracePeriod=2 Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.105171 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerID="5c215580e2e0b0cf62bc7ac30cef1d5fa2c2a75d0d764e100f830b2efb1a4706" exitCode=0 Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.105239 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerDied","Data":"5c215580e2e0b0cf62bc7ac30cef1d5fa2c2a75d0d764e100f830b2efb1a4706"} Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.587655 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.615587 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content\") pod \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.616139 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zclw6\" (UniqueName: \"kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6\") pod \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.616431 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities\") pod \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\" (UID: \"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb\") " Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.618946 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities" (OuterVolumeSpecName: "utilities") pod 
"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" (UID: "0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.621752 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6" (OuterVolumeSpecName: "kube-api-access-zclw6") pod "0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" (UID: "0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb"). InnerVolumeSpecName "kube-api-access-zclw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.639278 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" (UID: "0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.718456 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.718527 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:54:36 crc kubenswrapper[4745]: I1209 12:54:36.718542 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zclw6\" (UniqueName: \"kubernetes.io/projected/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb-kube-api-access-zclw6\") on node \"crc\" DevicePath \"\"" Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.113818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkfqc" event={"ID":"0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb","Type":"ContainerDied","Data":"33439b0fcf0f16fa1b5f2d28a91e76dad18b261701b9ca7e24dfcaadb0e4df5a"} Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.113908 4745 scope.go:117] "RemoveContainer" containerID="5c215580e2e0b0cf62bc7ac30cef1d5fa2c2a75d0d764e100f830b2efb1a4706" Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.113947 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkfqc" Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.134601 4745 scope.go:117] "RemoveContainer" containerID="d198ed650136a5c37c94b5dc7d43d4b323e6f43b66deb82140fa933f1db38afa" Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.153472 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.156666 4745 scope.go:117] "RemoveContainer" containerID="0a97a48e0f8acf707bcf879a8ea4cab2b24247c7597208fcb8657d6b3f599f7d" Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.161123 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkfqc"] Dec 09 12:54:37 crc kubenswrapper[4745]: I1209 12:54:37.574273 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" path="/var/lib/kubelet/pods/0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb/volumes" Dec 09 12:54:38 crc kubenswrapper[4745]: I1209 12:54:38.123812 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" event={"ID":"655b0c80-f015-45c6-ab25-67eff498ee15","Type":"ContainerStarted","Data":"e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa"} Dec 09 12:54:38 crc kubenswrapper[4745]: I1209 12:54:38.124087 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" event={"ID":"655b0c80-f015-45c6-ab25-67eff498ee15","Type":"ContainerStarted","Data":"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e"} Dec 09 12:54:38 crc kubenswrapper[4745]: I1209 12:54:38.143446 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" podStartSLOduration=2.79682179 podStartE2EDuration="9.143424777s" podCreationTimestamp="2025-12-09 12:54:29 +0000 UTC" 
firstStartedPulling="2025-12-09 12:54:29.990695343 +0000 UTC m=+4956.815896867" lastFinishedPulling="2025-12-09 12:54:36.33729833 +0000 UTC m=+4963.162499854" observedRunningTime="2025-12-09 12:54:38.138751761 +0000 UTC m=+4964.963953285" watchObservedRunningTime="2025-12-09 12:54:38.143424777 +0000 UTC m=+4964.968626301" Dec 09 12:54:55 crc kubenswrapper[4745]: I1209 12:54:55.475529 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:54:55 crc kubenswrapper[4745]: I1209 12:54:55.476079 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:55:25 crc kubenswrapper[4745]: I1209 12:55:25.476177 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:55:25 crc kubenswrapper[4745]: I1209 12:55:25.477808 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:55:25 crc kubenswrapper[4745]: I1209 12:55:25.981101 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/util/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.188633 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/util/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.207470 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/pull/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.213104 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/pull/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.341874 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/util/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.342949 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/extract/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.363365 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a202a8d0d29caf7438df631524a4272602a9619288fa076173728c3fc3lgnv9_40505d15-7c11-47d5-b8f6-a5a1cc5b8aa3/pull/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.540027 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-5wtvn_a8ec2c42-ff99-42c1-b0bb-e362207f4e3e/kube-rbac-proxy/0.log" Dec 09 12:55:26 crc 
kubenswrapper[4745]: I1209 12:55:26.589666 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-5wtvn_a8ec2c42-ff99-42c1-b0bb-e362207f4e3e/manager/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.619794 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-8h2wf_ae7706be-4789-4153-adf4-9abb8e4ee8d8/kube-rbac-proxy/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.731890 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-8h2wf_ae7706be-4789-4153-adf4-9abb8e4ee8d8/manager/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.741781 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-6vwcq_42df7b5d-8be1-4670-b063-e83cf65b1dae/kube-rbac-proxy/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.827420 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-6vwcq_42df7b5d-8be1-4670-b063-e83cf65b1dae/manager/0.log" Dec 09 12:55:26 crc kubenswrapper[4745]: I1209 12:55:26.899210 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-bngxs_668cdfd0-6b54-4750-b58f-97b26180b203/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.030739 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-bngxs_668cdfd0-6b54-4750-b58f-97b26180b203/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.073798 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-ljgwk_b026a92d-e1be-43c9-8e7a-1f66260bab18/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.117889 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-ljgwk_b026a92d-e1be-43c9-8e7a-1f66260bab18/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.256333 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h44ls_faa047d4-ea12-493a-aa0a-b429e0c5a123/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.267964 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h44ls_faa047d4-ea12-493a-aa0a-b429e0c5a123/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.427112 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-d2ksl_6368805c-1aab-4425-b599-671f89f30110/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.561848 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-mmpfw_62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.610962 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-mmpfw_62cbd8ba-a7e1-40ed-9e2c-8553a84ea6aa/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.634083 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-d2ksl_6368805c-1aab-4425-b599-671f89f30110/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.729361 4745 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-cxl42_641a4ff2-1fa2-4402-a313-d27dcc9c4294/kube-rbac-proxy/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.836469 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-cxl42_641a4ff2-1fa2-4402-a313-d27dcc9c4294/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.890490 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-dxc2l_8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d/manager/0.log" Dec 09 12:55:27 crc kubenswrapper[4745]: I1209 12:55:27.923277 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-dxc2l_8b171fb8-d84f-4b45-bbd9-8cf4dfff0a4d/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.010746 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-sch7g_336faca3-d04c-4ab5-b5dd-d9f031c80c64/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.119436 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-sch7g_336faca3-d04c-4ab5-b5dd-d9f031c80c64/manager/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.224770 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-85z5r_cbb6789a-1426-4ea2-aa2b-76959271ffc2/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.259162 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-85z5r_cbb6789a-1426-4ea2-aa2b-76959271ffc2/manager/0.log" Dec 09 12:55:28 crc 
kubenswrapper[4745]: I1209 12:55:28.366238 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-662j9_6354cdc2-296f-4479-a46e-9f2be37c4eef/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.480533 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-662j9_6354cdc2-296f-4479-a46e-9f2be37c4eef/manager/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.521387 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2tl6s_3e400982-edc3-462a-8917-5857ff6dd61e/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.571331 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2tl6s_3e400982-edc3-462a-8917-5857ff6dd61e/manager/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.685386 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47_5dcc5255-fdaa-463a-960c-5bc89c469a25/kube-rbac-proxy/0.log" Dec 09 12:55:28 crc kubenswrapper[4745]: I1209 12:55:28.710059 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-694d6cfbd6cmx47_5dcc5255-fdaa-463a-960c-5bc89c469a25/manager/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.189955 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7979d445b4-rz2zf_5c31507c-4f73-4d7d-85b5-45bd562ca0f3/operator/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.323963 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-klw9s_58788af8-97b1-4820-82dc-9ced93c8d7ce/registry-server/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.407357 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jw6tj_23db7946-4921-4ac3-aea8-55abc4c4ba1c/kube-rbac-proxy/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.497717 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-668858c49-7ffqt_4fb78653-78a7-4840-8bd4-1ac08145a845/manager/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.574939 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jw6tj_23db7946-4921-4ac3-aea8-55abc4c4ba1c/manager/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.631896 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7jfkb_97f9d87f-9f72-452a-ba66-29c916324b43/kube-rbac-proxy/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.710748 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7jfkb_97f9d87f-9f72-452a-ba66-29c916324b43/manager/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.744917 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-klghd_cbee1192-ad85-44db-9457-df5de60b7047/operator/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.904878 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-dfwwk_8221e85a-690c-4344-9bd7-5adc0e40b513/manager/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.919372 4745 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-dfwwk_8221e85a-690c-4344-9bd7-5adc0e40b513/kube-rbac-proxy/0.log" Dec 09 12:55:29 crc kubenswrapper[4745]: I1209 12:55:29.950821 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-f582k_6267400f-bf32-4ce3-b875-831cee43fc17/kube-rbac-proxy/0.log" Dec 09 12:55:30 crc kubenswrapper[4745]: I1209 12:55:30.041612 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-f582k_6267400f-bf32-4ce3-b875-831cee43fc17/manager/0.log" Dec 09 12:55:30 crc kubenswrapper[4745]: I1209 12:55:30.115433 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-np88r_40fc90a5-6278-4b54-8260-728265a2501a/manager/0.log" Dec 09 12:55:30 crc kubenswrapper[4745]: I1209 12:55:30.156402 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-np88r_40fc90a5-6278-4b54-8260-728265a2501a/kube-rbac-proxy/0.log" Dec 09 12:55:30 crc kubenswrapper[4745]: I1209 12:55:30.233400 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-4q2bv_e98864fd-e07e-4f9f-a9c1-b55c69a26922/kube-rbac-proxy/0.log" Dec 09 12:55:30 crc kubenswrapper[4745]: I1209 12:55:30.294378 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-4q2bv_e98864fd-e07e-4f9f-a9c1-b55c69a26922/manager/0.log" Dec 09 12:55:48 crc kubenswrapper[4745]: I1209 12:55:48.530196 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nnh4w_e697eb94-a732-4bb6-90c2-cd97e857b60b/control-plane-machine-set-operator/0.log" Dec 09 12:55:48 crc 
kubenswrapper[4745]: I1209 12:55:48.660361 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zkn7h_33da92e8-30d5-47b4-9d6a-496d4d1d1306/kube-rbac-proxy/0.log" Dec 09 12:55:48 crc kubenswrapper[4745]: I1209 12:55:48.675764 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zkn7h_33da92e8-30d5-47b4-9d6a-496d4d1d1306/machine-api-operator/0.log" Dec 09 12:55:55 crc kubenswrapper[4745]: I1209 12:55:55.476029 4745 patch_prober.go:28] interesting pod/machine-config-daemon-bc7sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 12:55:55 crc kubenswrapper[4745]: I1209 12:55:55.476761 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 12:55:55 crc kubenswrapper[4745]: I1209 12:55:55.476824 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" Dec 09 12:55:55 crc kubenswrapper[4745]: I1209 12:55:55.477656 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad"} pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 12:55:55 crc kubenswrapper[4745]: I1209 12:55:55.477741 4745 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerName="machine-config-daemon" containerID="cri-o://e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" gracePeriod=600 Dec 09 12:55:56 crc kubenswrapper[4745]: I1209 12:55:56.106575 4745 generic.go:334] "Generic (PLEG): container finished" podID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" exitCode=0 Dec 09 12:55:56 crc kubenswrapper[4745]: I1209 12:55:56.106667 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" event={"ID":"a9dc9202-9b7e-4a17-a80f-db9338f17cd7","Type":"ContainerDied","Data":"e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad"} Dec 09 12:55:56 crc kubenswrapper[4745]: I1209 12:55:56.107239 4745 scope.go:117] "RemoveContainer" containerID="6fafb25af4ecfb309e5aa3f74a920528b43595dbbe014e6edf6dc2b5c86d70ec" Dec 09 12:55:56 crc kubenswrapper[4745]: E1209 12:55:56.109285 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:55:57 crc kubenswrapper[4745]: I1209 12:55:57.115054 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:55:57 crc kubenswrapper[4745]: E1209 12:55:57.115357 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:56:00 crc kubenswrapper[4745]: I1209 12:56:00.545576 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-mqgqq_bb38d591-af06-4500-9ee6-8b336ce15761/cert-manager-controller/0.log" Dec 09 12:56:00 crc kubenswrapper[4745]: I1209 12:56:00.731004 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-9nvm2_7fa68a26-0e62-44ac-b591-b28d528fb71a/cert-manager-cainjector/0.log" Dec 09 12:56:00 crc kubenswrapper[4745]: I1209 12:56:00.775433 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-th5c6_bb6481d3-d46e-41a3-8400-caf27b4f3757/cert-manager-webhook/0.log" Dec 09 12:56:11 crc kubenswrapper[4745]: I1209 12:56:11.557238 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:56:11 crc kubenswrapper[4745]: E1209 12:56:11.558256 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.040872 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-f8fhk_26a2ae6b-2e87-4fc8-bc5e-60e2f605f38c/nmstate-console-plugin/0.log" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.246609 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-t5h2t_bb9420a5-03fd-46c5-a340-3e09aaf95935/nmstate-handler/0.log" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.305221 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bt5cb_8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc/nmstate-metrics/0.log" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.316017 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bt5cb_8d3b8d0a-0403-4e6b-bb04-1dcd0c7f91cc/kube-rbac-proxy/0.log" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.458664 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-8pkjt_b673e416-f871-4af2-a7ff-d0265a163ba1/nmstate-operator/0.log" Dec 09 12:56:13 crc kubenswrapper[4745]: I1209 12:56:13.518654 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-vxv69_b21bd812-2c99-405f-8ddf-cbca3e8a7c7c/nmstate-webhook/0.log" Dec 09 12:56:26 crc kubenswrapper[4745]: I1209 12:56:26.554434 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:56:26 crc kubenswrapper[4745]: E1209 12:56:26.555248 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:56:26 crc kubenswrapper[4745]: I1209 12:56:26.623033 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-bj6cc_54802f9b-3130-4b2c-a22c-fdbc95388e66/kube-rbac-proxy/0.log" Dec 09 12:56:26 crc 
kubenswrapper[4745]: I1209 12:56:26.821899 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-frr-files/0.log" Dec 09 12:56:26 crc kubenswrapper[4745]: I1209 12:56:26.943501 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-bj6cc_54802f9b-3130-4b2c-a22c-fdbc95388e66/controller/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.030026 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-reloader/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.063949 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-metrics/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.094015 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-frr-files/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.119280 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-reloader/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.271351 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-reloader/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.284345 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-metrics/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.289957 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-frr-files/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.290269 4745 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-metrics/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.473276 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-metrics/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.481871 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-reloader/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.493536 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/controller/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.497862 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/cp-frr-files/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.651067 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/frr-metrics/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.676539 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/kube-rbac-proxy/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.692926 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/kube-rbac-proxy-frr/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.842751 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/reloader/0.log" Dec 09 12:56:27 crc kubenswrapper[4745]: I1209 12:56:27.958147 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-fsg2c_c6e40c28-b12b-4e80-80ab-a0d5cf254c9c/frr-k8s-webhook-server/0.log" Dec 09 12:56:28 crc kubenswrapper[4745]: I1209 12:56:28.109317 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58c594ff54-8s4t4_b2f004c8-bb75-40ea-8377-fd965d8b8efa/manager/0.log" Dec 09 12:56:28 crc kubenswrapper[4745]: I1209 12:56:28.259132 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69fc6547dc-jm8rr_0d69da0b-6b01-4d22-b874-b43c308d712e/webhook-server/0.log" Dec 09 12:56:28 crc kubenswrapper[4745]: I1209 12:56:28.432240 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wkt54_9ea06c36-f5da-472b-9ad1-5ab4401e89e2/kube-rbac-proxy/0.log" Dec 09 12:56:28 crc kubenswrapper[4745]: I1209 12:56:28.838847 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wkt54_9ea06c36-f5da-472b-9ad1-5ab4401e89e2/speaker/0.log" Dec 09 12:56:29 crc kubenswrapper[4745]: I1209 12:56:29.045037 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2j5gd_3144339c-91b6-4d03-ae8f-d7ba80ba67ae/frr/0.log" Dec 09 12:56:37 crc kubenswrapper[4745]: I1209 12:56:37.555034 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:56:37 crc kubenswrapper[4745]: E1209 12:56:37.555721 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 
12:56:40.580894 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/util/0.log" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 12:56:40.692016 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/util/0.log" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 12:56:40.766306 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/pull/0.log" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 12:56:40.801172 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/pull/0.log" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 12:56:40.971593 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/pull/0.log" Dec 09 12:56:40 crc kubenswrapper[4745]: I1209 12:56:40.983541 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/util/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.399201 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/util/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.399231 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/util/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.400145 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931ahlhp4_ae7d9487-15ef-4298-914c-229ff735554f/extract/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.401897 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/pull/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.549569 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/pull/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.687038 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/util/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.691895 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/pull/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.724350 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw4wx2_4ce7d845-3d2d-409d-be8e-8dd3687cbc4e/extract/0.log" Dec 09 12:56:41 crc kubenswrapper[4745]: I1209 12:56:41.866291 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/util/0.log" Dec 
09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.016670 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/pull/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.017875 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/util/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.019026 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/pull/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.171398 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/pull/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.172258 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/util/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.211181 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ss2mm_51904623-2e7d-4a4d-a614-c916a8039fe9/extract/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.317964 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-utilities/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.494443 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-content/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.519207 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-utilities/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.535238 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-content/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.667513 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-utilities/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.687558 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/extract-content/0.log" Dec 09 12:56:42 crc kubenswrapper[4745]: I1209 12:56:42.881979 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-utilities/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.085219 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-content/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.096209 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-utilities/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.176579 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-content/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.321258 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-content/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.348004 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n655r_bc9c8707-e572-487a-bc95-08a093771e39/registry-server/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.359240 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/extract-utilities/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.624600 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rpddj_8eb53c13-6b59-4685-a3c0-e925a24f1f24/marketplace-operator/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.757241 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-utilities/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.971581 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-content/0.log" Dec 09 12:56:43 crc kubenswrapper[4745]: I1209 12:56:43.980996 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-utilities/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.041546 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-content/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.127889 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-prw8f_f725dd4f-d47f-4727-8722-88e91fe593b9/registry-server/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.172554 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-utilities/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.178761 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/extract-content/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.345541 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-utilities/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.403629 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4zp77_b77426b6-744b-4410-adb5-006a49cf8f1d/registry-server/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.543592 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-utilities/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.543923 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-content/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.556172 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-content/0.log" 
Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.699930 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-utilities/0.log" Dec 09 12:56:44 crc kubenswrapper[4745]: I1209 12:56:44.741052 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/extract-content/0.log" Dec 09 12:56:45 crc kubenswrapper[4745]: I1209 12:56:45.030075 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2dz9r_2f125ec9-331b-4128-85ce-9e03a7c28543/registry-server/0.log" Dec 09 12:56:48 crc kubenswrapper[4745]: I1209 12:56:48.554429 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:56:48 crc kubenswrapper[4745]: E1209 12:56:48.554697 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:00 crc kubenswrapper[4745]: I1209 12:57:00.555030 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:57:00 crc kubenswrapper[4745]: E1209 12:57:00.556879 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" 
podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:14 crc kubenswrapper[4745]: I1209 12:57:14.555202 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:57:14 crc kubenswrapper[4745]: E1209 12:57:14.556013 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:29 crc kubenswrapper[4745]: I1209 12:57:29.555009 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:57:29 crc kubenswrapper[4745]: E1209 12:57:29.555660 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:40 crc kubenswrapper[4745]: I1209 12:57:40.555355 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:57:40 crc kubenswrapper[4745]: E1209 12:57:40.556145 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:48 crc kubenswrapper[4745]: I1209 12:57:48.990417 4745 generic.go:334] "Generic (PLEG): container finished" podID="655b0c80-f015-45c6-ab25-67eff498ee15" containerID="de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e" exitCode=0 Dec 09 12:57:48 crc kubenswrapper[4745]: I1209 12:57:48.990588 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" event={"ID":"655b0c80-f015-45c6-ab25-67eff498ee15","Type":"ContainerDied","Data":"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e"} Dec 09 12:57:48 crc kubenswrapper[4745]: I1209 12:57:48.991556 4745 scope.go:117] "RemoveContainer" containerID="de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e" Dec 09 12:57:49 crc kubenswrapper[4745]: I1209 12:57:49.574632 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nzmtg_must-gather-5gqgb_655b0c80-f015-45c6-ab25-67eff498ee15/gather/0.log" Dec 09 12:57:51 crc kubenswrapper[4745]: I1209 12:57:51.554499 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:57:51 crc kubenswrapper[4745]: E1209 12:57:51.555115 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:57:56 crc kubenswrapper[4745]: I1209 12:57:56.590856 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nzmtg/must-gather-5gqgb"] Dec 09 12:57:56 crc kubenswrapper[4745]: I1209 12:57:56.591580 
4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="copy" containerID="cri-o://e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa" gracePeriod=2 Dec 09 12:57:56 crc kubenswrapper[4745]: I1209 12:57:56.598299 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nzmtg/must-gather-5gqgb"] Dec 09 12:57:56 crc kubenswrapper[4745]: I1209 12:57:56.951592 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nzmtg_must-gather-5gqgb_655b0c80-f015-45c6-ab25-67eff498ee15/copy/0.log" Dec 09 12:57:56 crc kubenswrapper[4745]: I1209 12:57:56.952204 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.049760 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nzmtg_must-gather-5gqgb_655b0c80-f015-45c6-ab25-67eff498ee15/copy/0.log" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.050254 4745 generic.go:334] "Generic (PLEG): container finished" podID="655b0c80-f015-45c6-ab25-67eff498ee15" containerID="e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa" exitCode=143 Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.050307 4745 scope.go:117] "RemoveContainer" containerID="e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.050383 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nzmtg/must-gather-5gqgb" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.068995 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqdct\" (UniqueName: \"kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct\") pod \"655b0c80-f015-45c6-ab25-67eff498ee15\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.069068 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output\") pod \"655b0c80-f015-45c6-ab25-67eff498ee15\" (UID: \"655b0c80-f015-45c6-ab25-67eff498ee15\") " Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.081712 4745 scope.go:117] "RemoveContainer" containerID="de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.082799 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct" (OuterVolumeSpecName: "kube-api-access-jqdct") pod "655b0c80-f015-45c6-ab25-67eff498ee15" (UID: "655b0c80-f015-45c6-ab25-67eff498ee15"). InnerVolumeSpecName "kube-api-access-jqdct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.171477 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqdct\" (UniqueName: \"kubernetes.io/projected/655b0c80-f015-45c6-ab25-67eff498ee15-kube-api-access-jqdct\") on node \"crc\" DevicePath \"\"" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.176120 4745 scope.go:117] "RemoveContainer" containerID="e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa" Dec 09 12:57:57 crc kubenswrapper[4745]: E1209 12:57:57.176756 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa\": container with ID starting with e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa not found: ID does not exist" containerID="e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.176788 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa"} err="failed to get container status \"e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa\": rpc error: code = NotFound desc = could not find container \"e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa\": container with ID starting with e0bee869a26196f55038c9601dc201c4d24351da89c530403dc9f1411b85efaa not found: ID does not exist" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.176809 4745 scope.go:117] "RemoveContainer" containerID="de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e" Dec 09 12:57:57 crc kubenswrapper[4745]: E1209 12:57:57.177170 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e\": container with ID starting with de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e not found: ID does not exist" containerID="de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.177192 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e"} err="failed to get container status \"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e\": rpc error: code = NotFound desc = could not find container \"de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e\": container with ID starting with de6bbbecc2035d588052b50553a8bc6ed53992c9caa1f1153a3b6d78cf43676e not found: ID does not exist" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.179668 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "655b0c80-f015-45c6-ab25-67eff498ee15" (UID: "655b0c80-f015-45c6-ab25-67eff498ee15"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.273023 4745 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/655b0c80-f015-45c6-ab25-67eff498ee15-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 09 12:57:57 crc kubenswrapper[4745]: I1209 12:57:57.564335 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" path="/var/lib/kubelet/pods/655b0c80-f015-45c6-ab25-67eff498ee15/volumes" Dec 09 12:58:05 crc kubenswrapper[4745]: I1209 12:58:05.555011 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:58:05 crc kubenswrapper[4745]: E1209 12:58:05.555854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:58:18 crc kubenswrapper[4745]: I1209 12:58:18.556146 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:58:18 crc kubenswrapper[4745]: E1209 12:58:18.557348 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:58:29 crc kubenswrapper[4745]: I1209 12:58:29.554899 4745 
scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:58:29 crc kubenswrapper[4745]: E1209 12:58:29.556024 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.169801 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:33 crc kubenswrapper[4745]: E1209 12:58:33.170846 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="registry-server" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.170905 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="registry-server" Dec 09 12:58:33 crc kubenswrapper[4745]: E1209 12:58:33.170933 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="extract-utilities" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.170951 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="extract-utilities" Dec 09 12:58:33 crc kubenswrapper[4745]: E1209 12:58:33.170982 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="extract-content" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171002 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="extract-content" Dec 09 12:58:33 crc 
kubenswrapper[4745]: E1209 12:58:33.171060 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="gather" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171079 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="gather" Dec 09 12:58:33 crc kubenswrapper[4745]: E1209 12:58:33.171101 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="copy" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171118 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="copy" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171542 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="copy" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171594 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="655b0c80-f015-45c6-ab25-67eff498ee15" containerName="gather" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.171635 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b44aafa-7ea1-4b8d-a0d5-2ec2517edfdb" containerName="registry-server" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.174291 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.182164 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.242072 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.242148 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.242306 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.343089 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.343155 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.343200 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.343687 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.343834 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.363477 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd\") pod \"redhat-operators-vxrhm\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.495143 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:33 crc kubenswrapper[4745]: I1209 12:58:33.965585 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:34 crc kubenswrapper[4745]: I1209 12:58:34.347907 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerID="99fdb452cbd42040ee9d5ff404aff111801daa2de625bbba086060d1a593d26b" exitCode=0 Dec 09 12:58:34 crc kubenswrapper[4745]: I1209 12:58:34.348006 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerDied","Data":"99fdb452cbd42040ee9d5ff404aff111801daa2de625bbba086060d1a593d26b"} Dec 09 12:58:34 crc kubenswrapper[4745]: I1209 12:58:34.348215 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerStarted","Data":"42a07ac5fe193a47729361c449b0995336e95269dbcfe96d660f8af584b9e8fc"} Dec 09 12:58:34 crc kubenswrapper[4745]: I1209 12:58:34.349840 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 12:58:35 crc kubenswrapper[4745]: I1209 12:58:35.357128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerStarted","Data":"454299ae94d9d91f385b12a97ae8b8c501438886bfed876bc15576d83c774b4c"} Dec 09 12:58:36 crc kubenswrapper[4745]: I1209 12:58:36.366235 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerID="454299ae94d9d91f385b12a97ae8b8c501438886bfed876bc15576d83c774b4c" exitCode=0 Dec 09 12:58:36 crc kubenswrapper[4745]: I1209 12:58:36.366313 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerDied","Data":"454299ae94d9d91f385b12a97ae8b8c501438886bfed876bc15576d83c774b4c"} Dec 09 12:58:37 crc kubenswrapper[4745]: I1209 12:58:37.376265 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerStarted","Data":"46151ee5d8f879a11adf438db98f0594256dab923ef0985f0d0c109cab2d1576"} Dec 09 12:58:37 crc kubenswrapper[4745]: I1209 12:58:37.401843 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vxrhm" podStartSLOduration=1.991024235 podStartE2EDuration="4.401821886s" podCreationTimestamp="2025-12-09 12:58:33 +0000 UTC" firstStartedPulling="2025-12-09 12:58:34.34952994 +0000 UTC m=+5201.174731464" lastFinishedPulling="2025-12-09 12:58:36.760327591 +0000 UTC m=+5203.585529115" observedRunningTime="2025-12-09 12:58:37.396054701 +0000 UTC m=+5204.221256225" watchObservedRunningTime="2025-12-09 12:58:37.401821886 +0000 UTC m=+5204.227023410" Dec 09 12:58:43 crc kubenswrapper[4745]: I1209 12:58:43.496188 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:43 crc kubenswrapper[4745]: I1209 12:58:43.496851 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:43 crc kubenswrapper[4745]: I1209 12:58:43.562862 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:44 crc kubenswrapper[4745]: I1209 12:58:44.481246 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:44 crc kubenswrapper[4745]: I1209 12:58:44.535741 4745 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:44 crc kubenswrapper[4745]: I1209 12:58:44.555226 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:58:44 crc kubenswrapper[4745]: E1209 12:58:44.555527 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:58:46 crc kubenswrapper[4745]: I1209 12:58:46.443866 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vxrhm" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="registry-server" containerID="cri-o://46151ee5d8f879a11adf438db98f0594256dab923ef0985f0d0c109cab2d1576" gracePeriod=2 Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.461254 4745 generic.go:334] "Generic (PLEG): container finished" podID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerID="46151ee5d8f879a11adf438db98f0594256dab923ef0985f0d0c109cab2d1576" exitCode=0 Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.461305 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerDied","Data":"46151ee5d8f879a11adf438db98f0594256dab923ef0985f0d0c109cab2d1576"} Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.645806 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.716161 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities\") pod \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.716240 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content\") pod \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.716281 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd\") pod \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\" (UID: \"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54\") " Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.717175 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities" (OuterVolumeSpecName: "utilities") pod "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" (UID: "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.722839 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd" (OuterVolumeSpecName: "kube-api-access-2qqkd") pod "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" (UID: "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54"). InnerVolumeSpecName "kube-api-access-2qqkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.818372 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.818405 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-kube-api-access-2qqkd\") on node \"crc\" DevicePath \"\"" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.819968 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" (UID: "d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 12:58:48 crc kubenswrapper[4745]: I1209 12:58:48.919536 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.472180 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxrhm" event={"ID":"d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54","Type":"ContainerDied","Data":"42a07ac5fe193a47729361c449b0995336e95269dbcfe96d660f8af584b9e8fc"} Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.472307 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxrhm" Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.473058 4745 scope.go:117] "RemoveContainer" containerID="46151ee5d8f879a11adf438db98f0594256dab923ef0985f0d0c109cab2d1576" Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.513173 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.519901 4745 scope.go:117] "RemoveContainer" containerID="454299ae94d9d91f385b12a97ae8b8c501438886bfed876bc15576d83c774b4c" Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.521830 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vxrhm"] Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.540544 4745 scope.go:117] "RemoveContainer" containerID="99fdb452cbd42040ee9d5ff404aff111801daa2de625bbba086060d1a593d26b" Dec 09 12:58:49 crc kubenswrapper[4745]: I1209 12:58:49.565081 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" path="/var/lib/kubelet/pods/d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54/volumes" Dec 09 12:58:57 crc kubenswrapper[4745]: I1209 12:58:57.556135 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:58:57 crc kubenswrapper[4745]: E1209 12:58:57.556618 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:59:11 crc kubenswrapper[4745]: I1209 12:59:11.555654 4745 scope.go:117] "RemoveContainer" 
containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:59:11 crc kubenswrapper[4745]: E1209 12:59:11.556415 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:59:22 crc kubenswrapper[4745]: I1209 12:59:22.555265 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:59:22 crc kubenswrapper[4745]: E1209 12:59:22.555882 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:59:36 crc kubenswrapper[4745]: I1209 12:59:36.554768 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:59:36 crc kubenswrapper[4745]: E1209 12:59:36.555457 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 12:59:51 crc kubenswrapper[4745]: I1209 12:59:51.554837 4745 scope.go:117] 
"RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 12:59:51 crc kubenswrapper[4745]: E1209 12:59:51.555560 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.149729 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54"] Dec 09 13:00:00 crc kubenswrapper[4745]: E1209 13:00:00.150615 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="registry-server" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.150632 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="registry-server" Dec 09 13:00:00 crc kubenswrapper[4745]: E1209 13:00:00.150663 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="extract-content" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.150673 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="extract-content" Dec 09 13:00:00 crc kubenswrapper[4745]: E1209 13:00:00.150691 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="extract-utilities" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.150700 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="extract-utilities" Dec 09 13:00:00 crc 
kubenswrapper[4745]: I1209 13:00:00.150913 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a02ccf-a1b9-457d-b8c9-c4f83fb1be54" containerName="registry-server" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.151463 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.153715 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.153767 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.158526 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjvmg\" (UniqueName: \"kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.158804 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.158953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume\") pod 
\"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.162919 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54"] Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.260777 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.260892 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjvmg\" (UniqueName: \"kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.260932 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.261991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.267324 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.278379 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjvmg\" (UniqueName: \"kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg\") pod \"collect-profiles-29421420-gkw54\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.477360 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:00 crc kubenswrapper[4745]: I1209 13:00:00.930533 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54"] Dec 09 13:00:01 crc kubenswrapper[4745]: I1209 13:00:01.001032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" event={"ID":"ad507423-dda0-43ea-a025-23b7a1c2dfa2","Type":"ContainerStarted","Data":"36a1c91ea621061e1902bf35e31ff75abeb136d10731e09b020f73f7c7fb324f"} Dec 09 13:00:02 crc kubenswrapper[4745]: I1209 13:00:02.008833 4745 generic.go:334] "Generic (PLEG): container finished" podID="ad507423-dda0-43ea-a025-23b7a1c2dfa2" containerID="c72a0598c3238f92112790637a2375e34fadca3db3a9bf59b99b8fc7d6024d1f" exitCode=0 Dec 09 13:00:02 crc kubenswrapper[4745]: I1209 13:00:02.008885 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" event={"ID":"ad507423-dda0-43ea-a025-23b7a1c2dfa2","Type":"ContainerDied","Data":"c72a0598c3238f92112790637a2375e34fadca3db3a9bf59b99b8fc7d6024d1f"} Dec 09 13:00:02 crc kubenswrapper[4745]: I1209 13:00:02.554957 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 13:00:02 crc kubenswrapper[4745]: E1209 13:00:02.555141 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.338630 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.506844 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume\") pod \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.507008 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume\") pod \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.507808 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad507423-dda0-43ea-a025-23b7a1c2dfa2" (UID: "ad507423-dda0-43ea-a025-23b7a1c2dfa2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.508040 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjvmg\" (UniqueName: \"kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg\") pod \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\" (UID: \"ad507423-dda0-43ea-a025-23b7a1c2dfa2\") " Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.508755 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad507423-dda0-43ea-a025-23b7a1c2dfa2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.512214 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad507423-dda0-43ea-a025-23b7a1c2dfa2" (UID: "ad507423-dda0-43ea-a025-23b7a1c2dfa2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.512938 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg" (OuterVolumeSpecName: "kube-api-access-pjvmg") pod "ad507423-dda0-43ea-a025-23b7a1c2dfa2" (UID: "ad507423-dda0-43ea-a025-23b7a1c2dfa2"). InnerVolumeSpecName "kube-api-access-pjvmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.609321 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjvmg\" (UniqueName: \"kubernetes.io/projected/ad507423-dda0-43ea-a025-23b7a1c2dfa2-kube-api-access-pjvmg\") on node \"crc\" DevicePath \"\"" Dec 09 13:00:03 crc kubenswrapper[4745]: I1209 13:00:03.609364 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad507423-dda0-43ea-a025-23b7a1c2dfa2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 13:00:04 crc kubenswrapper[4745]: I1209 13:00:04.026714 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" event={"ID":"ad507423-dda0-43ea-a025-23b7a1c2dfa2","Type":"ContainerDied","Data":"36a1c91ea621061e1902bf35e31ff75abeb136d10731e09b020f73f7c7fb324f"} Dec 09 13:00:04 crc kubenswrapper[4745]: I1209 13:00:04.026776 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36a1c91ea621061e1902bf35e31ff75abeb136d10731e09b020f73f7c7fb324f" Dec 09 13:00:04 crc kubenswrapper[4745]: I1209 13:00:04.026826 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421420-gkw54" Dec 09 13:00:04 crc kubenswrapper[4745]: I1209 13:00:04.407810 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"] Dec 09 13:00:04 crc kubenswrapper[4745]: I1209 13:00:04.412383 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421375-w6jdw"] Dec 09 13:00:05 crc kubenswrapper[4745]: I1209 13:00:05.566299 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a423627e-0179-440f-a680-eaee72bdc514" path="/var/lib/kubelet/pods/a423627e-0179-440f-a680-eaee72bdc514/volumes" Dec 09 13:00:06 crc kubenswrapper[4745]: I1209 13:00:06.531052 4745 scope.go:117] "RemoveContainer" containerID="9b79014c7124d9db17e37f09f0b1757a7df280c70480472354afb21289e7606f" Dec 09 13:00:14 crc kubenswrapper[4745]: I1209 13:00:14.555189 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 13:00:14 crc kubenswrapper[4745]: E1209 13:00:14.556026 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 13:00:28 crc kubenswrapper[4745]: I1209 13:00:28.554812 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 13:00:28 crc kubenswrapper[4745]: E1209 13:00:28.555463 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7" Dec 09 13:00:40 crc kubenswrapper[4745]: I1209 13:00:40.555438 4745 scope.go:117] "RemoveContainer" containerID="e349c90b61509f3f5c6a8f562a0846a2c08cb2c2d50bed23ef3cb8e5e97ba0ad" Dec 09 13:00:40 crc kubenswrapper[4745]: E1209 13:00:40.556029 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bc7sx_openshift-machine-config-operator(a9dc9202-9b7e-4a17-a80f-db9338f17cd7)\"" pod="openshift-machine-config-operator/machine-config-daemon-bc7sx" podUID="a9dc9202-9b7e-4a17-a80f-db9338f17cd7"